Enabling Connector logging into Kafka

As of Axual Platform 2023.2, it is possible to view connector logging in the Self-Service interface. Moreover, it is possible to offload logging data to an external system by using a connector.

Enabling connector logging is done on multiple levels.

  • Connector logging support is enabled on the Self-Service interface. This is done per instance.

  • A Kafka log appender is configured for Axual Connect. This ensures that Axual Connect writes all log messages to a Kafka topic per connector application per environment.

Connector logging is only available for Axual Connect instances which use static configuration; see Step 3 below.
Connector logging is only available for new Connector applications.

Step 1: Enabling connector logging support in Self-Service

Prerequisites: you are running Axual Platform 2023.2 or higher (see Releases of Axual Platform for the available versions).

To enable connector logging support:

  1. Update the values.yaml for Axual Platform, specifically the part related to connector logging. Make sure that connectorLogging.enabled is set to true and restart the UI service.

    connectorLogging:
      enabled: true
  2. Log in to Self-Service as someone with the TENANT_ADMIN role

  3. Go to the respective instance page

  4. Click "Edit instance"

  5. Make sure that "Enable connect" is enabled for that instance

  6. Enable "Connector Logging" for the instance

  7. Upload the certificate which is used by Axual Connect. This is done to ensure that Connect has the right privileges to write to the logging topics. You can find the certificate in the Secret "axual-local-connect-client-certificates" under "tls.crt".

  8. Click "Update instance" to store the configuration
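The certificate mentioned in step 7 can be extracted from the Kubernetes Secret with kubectl. A minimal sketch, assuming the Secret lives in the kafka namespace (adjust the namespace to your installation):

```shell
# Extract the Connect client certificate from the Secret.
# The "kafka" namespace is an assumption; change it to where Connect is deployed.
kubectl get secret axual-local-connect-client-certificates -n kafka \
  -o jsonpath='{.data.tls\.crt}' | base64 -d > connect-client.crt
```

The resulting connect-client.crt file is what you upload in the instance configuration.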

Step 2: Installing the log appender

To view connector logging in Self-Service, the Axual logging appender must be installed, made available as a library to Connect, and configured in values.yaml. To install the appender, update the commonResourcesFile which is passed to axual-connect to be downloaded on startup.

  1. Check which reference you have for the commonResourcesFile in the values.yaml that you use for axual-connect-helm

  2. Download the file axual-connect-common-resources-1.0.0.tgz from the artifactsBaseUrl to your local machine and unpack the tarball to a temporary directory, for example:

Note: if for any reason you are using a different (newer) version of the common resources file, download that one instead.
mkdir temp-commons
cd temp-commons
wget [url-to-commons-file]
tar xzf [commons-file]

You have now unpacked all the common libraries and are ready to add a new one. Next, add the axual-logging-appender jar file to the directory and compress the tarball again. The URL of the logging appender is shown in the example below.


cd temp-commons
wget https://stpaxualconnect.blob.core.windows.net/axual-0e0tyou2/axual-logging-appenders-1.0.2.jar
rm axual-connect-common-resources-1.0.0.tgz
tar --disable-copyfile czf axual-connect-common-resources-1.1.0.tgz *
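Before publishing the new tarball, you can confirm that the appender jar actually made it into the archive, for example:

```shell
# List the contents of the repacked tarball and confirm the logging appender jar is present
tar tzf axual-connect-common-resources-1.1.0.tgz | grep axual-logging-appenders
```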
  3. Make the new tarball (axual-connect-common-resources-1.1.0.tgz) available on a web server, like the original one.

  4. Update the values.yaml for axual-connect-helm to use the new tarball:

    downloadPlugins:
      artifactsBaseUrl: "[URL_OF_YOUR_FILE_SERVER]" # Do not change this
      connectPluginsFile: "[PATH_TO_YOUR_PLUGINS_TARBALL]" # Do not change this
      commonResourcesFile: "axual-connect-common-resources-1.1.0.tgz"
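Before restarting Connect, you can quickly check that the tarball is actually downloadable from its published location. A sketch using a temporary local web server (the port is arbitrary; in a real setup you would curl the URL served by your file server):

```shell
# Serve the directory containing the tarball (testing only), then verify the download.
# Run this from the directory that holds axual-connect-common-resources-1.1.0.tgz.
python3 -m http.server 8000 &
SERVER_PID=$!
sleep 1
# -f makes curl exit non-zero on HTTP errors such as 404
curl -sf http://localhost:8000/axual-connect-common-resources-1.1.0.tgz -o /dev/null \
  && echo "tarball reachable"
kill "$SERVER_PID"
```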

Step 3: Configuring Connect

Now that the logging appender library is available and configured for Connect, you must put Connect in static configuration mode and enable routedLogging, as described below.

  1. In your values file, make sure that routed logging is enabled and that you are using static configuration mode. Which values to use depends on your installation; if you are unsure, please contact Axual Support.

    You can retrieve the values for the following configuration from Discovery API at https://discovery.endpoint/v2

axual:
  configMode: "static"
  staticConfig:
    bootstrap.servers: "example.host:9092"
    tenant: "axual"
    instance: "local"
    cluster: "local"
    schema.registry.url: "https://platform.local:25000"
    group.id.pattern: "{tenant}-{instance}-{environment}-{group}"
    topic.pattern: "{tenant}-{instance}-{environment}-{topic}"
    transactional.id.pattern: "{tenant}-{instance}-{environment}-{transactional.id}"
    enable.value.headers: "false" # Leave at default
    group.id.resolver: "io.axual.common.resolver.GroupPatternResolver" # Leave at default
    topic.resolver: "io.axual.common.resolver.TopicPatternResolver" # Leave at default
    transactional.id.resolver: "io.axual.common.resolver.TransactionalIdPatternResolver" # Leave at default


routedLogging:
  enabled: true
  suppressEnvironment: false
  pattern: '%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{36} %msg'
  2. Issue a helm upgrade command for Axual Connect:

    helm upgrade --install -n kafka axual-connect [CHART_REFERENCE] -f [YOUR_CUSTOM_VALUES]
  3. The initialization of the logging appender should be visible in the Connect logging:

    10:19:44,997 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - About to instantiate appender of type [io.axual.connect.logging.logback.RoutingKafkaAppender]
    10:19:44,999 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - Naming appender as [KAFKA]
    10:19:45,001 |-INFO in ch.qos.logback.core.joran.action.NestedComplexPropertyIA - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property
    10:19:45,009 |-INFO in io.axual.connect.logging.logback.RoutingKafkaAppender[KAFKA] - Logger producer properties:
    
       (configured Kafka producer properties are shown here)
    
    10:19:45,150 |-INFO in io.axual.connect.logging.logback.RoutingKafkaAppender[KAFKA] - Producer configuration complete. Starting appender
    10:19:45,150 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - About to instantiate appender of type [ch.qos.logback.classic.AsyncAppender]
    10:19:45,153 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - Naming appender as [ASYNC_KAFKA]
    10:19:45,154 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [KAFKA] to ch.qos.logback.classic.AsyncAppender[ASYNC_KAFKA]
    10:19:45,154 |-INFO in ch.qos.logback.classic.AsyncAppender[ASYNC_KAFKA] - Attaching appender named [KAFKA] to AsyncAppender.
    10:19:45,155 |-INFO in ch.qos.logback.classic.AsyncAppender[ASYNC_KAFKA] - Setting discardingThreshold to 51
    10:19:45,156 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [ASYNC_KAFKA] to Logger[ROOT]
  4. Verify the changes by creating a connector application, configuring it in an environment, and viewing the logging in the Self-Service interface. For more information, follow the Self-Service docs: create a Connect application in Self-Service.
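If the appender initialization lines do not appear, you can search the Connect logs directly. A sketch, assuming Connect runs as a Kubernetes Deployment named axual-connect in the kafka namespace (both names are assumptions; adjust to your installation):

```shell
# Search the Connect pod logs for the Kafka appender initialization messages
kubectl logs -n kafka deployment/axual-connect | grep RoutingKafkaAppender
```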

Optional: additional helm configuration

The following values can be configured through Helm:

  • enabled (mandatory: no, default: false)
    Indicates whether connector logging is enabled.

  • suppressEnvironment (mandatory: no, default: false)
    Determines whether the environment is considered when determining the log topic name.

  • pattern (mandatory: yes, no default)
    Log message format; Connect logging uses Logback's PatternLayout. See the Logback documentation for examples.

  • enableHostnameVerification (mandatory: no, default: true)
    In certain configurations, SSL hostname verification can prevent the log appender from producing. In those cases this setting can be used to disable it.

  • debugMode (mandatory: no, default: false)
    Use this setting to see in detail what the log appender is doing. When enabled, the appender logs the received connector context, the derived topic where logging is routed, and the produced offset.
    Note: the connector context and topic are logged per topic until 5 log lines have been successfully produced on that topic. In production, this can lead to a large amount of logging at Connect startup.