Connectors

Supported Connectors

| Connector Name | Supplier | Version | Connector Type (Source/Sink) | Documentation | Source Code | Description |
|---|---|---|---|---|---|---|
| JDBC | Confluent | 5.0.4 | Source & Sink | Link | Link | The JDBC source connector imports data from any relational database with a JDBC driver into Kafka topics; the JDBC sink connector exports data from Kafka topics to any relational database with a JDBC driver. Note: supported only for MariaDB. |
| Snowflake | Snowflake | 1.4.0 | Sink | Link | Link | This connector exports data from one or more Kafka topics and loads it into a Snowflake table. |
| Amazon S3 | Confluent | 5.0.4 | Sink | Link | Link | This connector allows you to export data from Kafka topics to Amazon S3. Important: your Connect cluster must be configured to use S3. Contact your operator before using this connector. |
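
To illustrate what a connector configuration looks like, the JSON below sketches a hypothetical JDBC source connector for MariaDB. The connector name, connection URL, credentials, column, and topic prefix are placeholders; the property names follow Confluent's JDBC source connector.

```json
{
  "name": "jdbc-mariadb-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "tasks.max": "1",
    "connection.url": "jdbc:mariadb://db-host:3306/mydb",
    "connection.user": "connect",
    "connection.password": "secret",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "mariadb-"
  }
}
```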

Connectors Maintenance

Adding New Connector Plugins

  1. Repeat the following steps for each machine where Connect is running.

    1. Create a new directory under the path defined in CONNECT_PLUGINS_DIR_PATH and copy the connector JAR files into it.
      If a plugin requires multiple JARs (such as the JDBC connector and its JDBC drivers), put them together in one directory.
      For example:

      [CONNECT_PLUGINS_DIR_PATH]
      ├── jdbc
      │   ├── jdbc-connector.jar
      │   └── mariadb-jdbc.jar
      ├── snowflake
      │   ├── snowflake-connector.jar
      │   └── bouncy-castle.jar
      └─ [NEW PLUGIN NAME]
          ├── new-plugin.jar
          └── helper.jar
    2. Restart Connect

      axual.sh [start|restart|stop] client <instance-name> axual-connect
      Restart the Connect services in a rolling fashion; otherwise there will be downtime.
  2. Continue by creating an application of the Connector type: Creating A Connector Application
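
The copy step above can be sketched as a small shell script for a single Connect machine. This is a sketch, not the official tooling: the plugin directory is assumed to come from CONNECT_PLUGINS_DIR_PATH (a temporary directory stands in when it is unset), and new-plugin.jar / helper.jar are placeholder JAR names created here only to make the sketch runnable.

```shell
#!/bin/sh
# Plugin directory: taken from CONNECT_PLUGINS_DIR_PATH if set,
# otherwise a temp dir stands in for this sketch.
PLUGINS_DIR="${CONNECT_PLUGINS_DIR_PATH:-$(mktemp -d)}"

# One subdirectory per plugin; all JARs the plugin needs stay together.
mkdir -p "$PLUGINS_DIR/new-plugin"

# Placeholder files standing in for the real plugin and helper JARs.
touch new-plugin.jar helper.jar
cp new-plugin.jar helper.jar "$PLUGINS_DIR/new-plugin/"

# Verify the layout before restarting Connect.
ls "$PLUGINS_DIR/new-plugin"
```

After copying, restart Connect on that machine (`axual.sh restart client <instance-name> axual-connect`) and repeat the procedure for every machine in the cluster.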

To share Java resources, configuration, or certificates, see the CONNECT_COMMON_CLASSES_PATH, CONNECT_CONFIG_DIR_PATH, and CONNECT_CLIENT_CERTS_PATH configurations.
You will need to restart Connect to make those resources available.