Example configurations for the Kafka Sink Connector

Example 1 - Minimal configuration

No static headers are created and the default metadata headers are sent.
The source topic selector and source partition selector are used. These selectors require no further configuration, but they do require that each topic exists on the remote system with the same number of partitions.

include::example$/kafka-sink/minimal-config.json[]
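The referenced example file is not reproduced here. As a rough illustration only, a minimal configuration could look like the sketch below, written as a Kafka Connect properties file rather than the JSON of the example file. Apart from the standard Kafka Connect keys (name, connector.class, tasks.max, topics and the converters), every property name is an assumption and should be checked against the connector's configuration reference.

[source,properties]
----
# Hypothetical sketch - connector-specific keys are assumed names, not taken from the example file.
name=kafka-sink-minimal
# Replace with the Kafka Sink connector class from the plugin documentation.
connector.class=<Kafka Sink connector class>
tasks.max=1
topics=test-topic

# Forward record bytes unchanged (standard Kafka Connect converters).
key.converter=org.apache.kafka.connect.converters.ByteArrayConverter
value.converter=org.apache.kafka.connect.converters.ByteArrayConverter

# Assumed name for the remote cluster connection setting.
remote.bootstrap.servers=remote-broker-1:9092,remote-broker-2:9092
----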

Example 2 - Minimal configuration using TLS Client certificate

No static headers are created and the default metadata headers are sent.
The source topic selector and source partition selector are used. These selectors require no further configuration, but they do require that each topic exists on the remote system with the same number of partitions.
The remote cluster uses TLS and requires a client certificate to be provided.

include::example$/kafka-sink/minimal-with-tls-config.json[]
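Again a hypothetical sketch, not the contents of the example file. The ssl.* settings are the standard Kafka client TLS settings; the remote. prefix that scopes them to the producer for the remote cluster is an assumption.

[source,properties]
----
# Hypothetical sketch - the remote. prefix for the producer settings is assumed.
remote.bootstrap.servers=remote-broker-1:9093
remote.security.protocol=SSL
# Client certificate for mutual TLS (standard Kafka client SSL settings).
remote.ssl.keystore.location=/etc/kafka/secrets/client.keystore.jks
remote.ssl.keystore.password=keystore-secret
remote.ssl.key.password=key-secret
remote.ssl.truststore.location=/etc/kafka/secrets/client.truststore.jks
remote.ssl.truststore.password=truststore-secret
----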

Example 3 - Using static headers

Two static headers are added to each produced record.

include::example$/kafka-sink/static-headers-config.json[]
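The example file is not shown here; as an illustration, two static headers could be configured along these lines. The property names are assumptions, and the header names and values are placeholders.

[source,properties]
----
# Hypothetical sketch - the static header property names are assumed.
headers.static.X-Forwarded-By=kafka-sink
headers.static.X-Source-Cluster=local-cluster
----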

Example 4 - Using metadata forwarding

Each produced record carries the metadata of the original record, using custom header names.

include::example$/kafka-sink/metadata-forwarding-config.json[]
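As a sketch of the idea only: the original record's topic, partition, offset and timestamp are forwarded under custom header names. The property names below are assumptions, the header names are placeholders.

[source,properties]
----
# Hypothetical sketch - the metadata header property names are assumed.
metadata.headers.topic=X-Source-Topic
metadata.headers.partition=X-Source-Partition
metadata.headers.offset=X-Source-Offset
metadata.headers.timestamp=X-Source-Timestamp
----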

Example 5 - Using source topic selector

Each record is written to the same topic as the original record. This configuration is functionally identical to the minimal configuration example.

include::example$/kafka-sink/topic-selector-source-config.json[]
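Purely illustrative; the selector property name and value below are assumptions rather than the contents of the example file.

[source,properties]
----
# Hypothetical sketch - selector property name and value are assumed.
topic.selector=source
----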

Example 6 - Using fixed topic selector

Each record is written to the remote topic forwarded-to-us, regardless of which local topic it was read from.

include::example$/kafka-sink/topic-selector-fixed-config.json[]
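A hypothetical sketch: the selector property names are assumptions, while the target topic name forwarded-to-us comes from the description above.

[source,properties]
----
# Hypothetical sketch - selector property names are assumed.
topic.selector=fixed
topic.selector.fixed.topic=forwarded-to-us
----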

Example 7 - Using prefix topic selector

This example produces records to a topic with the same name on the remote cluster, but with the name prefixed with the supplied value.
The selector configuration sets the prefix to forwarded-.
In this case the connector only reads from topic test-topic, resulting in the target topic name forwarded-test-topic.

include::example$/kafka-sink/topic-selector-prefix-config.json[]
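As a sketch only: the selector property names are assumptions, while the prefix value and the source topic come from the description above.

[source,properties]
----
# Hypothetical sketch - selector property names are assumed.
topics=test-topic
topic.selector=prefix
topic.selector.prefix=forwarded-
# Records read from test-topic end up on forwarded-test-topic.
----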

Example 8 - Using mapping topic selector

This example produces records to remote topics determined by an explicit mapping of local topic names to remote topic names.
In this case the connector only reads from topic test-topic, resulting in the target topic name forwarded-from-test-topic. The other mapping maps the local topic topic2 to the target topic another-out.

include::example$/kafka-sink/topic-selector-mapping-config.json[]
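Illustrative only; the selector property names and the mapping syntax are assumptions, while the topic names come from the description above.

[source,properties]
----
# Hypothetical sketch - selector property names and mapping syntax are assumed.
topics=test-topic,topic2
topic.selector=mapping
topic.selector.mapping=test-topic:forwarded-from-test-topic,topic2:another-out
----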

Example 9 - Using source partition selector

Each record is written to the same partition as the original record.
This example is functionally identical to the Minimal configuration example.

include::example$/kafka-sink/partition-selector-source-config.json[]
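A hypothetical sketch; the selector property name and value are assumptions.

[source,properties]
----
# Hypothetical sketch - selector property name and value are assumed.
partition.selector=source
----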

Example 10 - Using partitioner partition selector

The partition of each record produced to the remote cluster is determined by the Partitioner class used by the producer.
The remote.partitioner.class configuration is set to Kafka's DefaultPartitioner to determine the target partition.

include::example$/kafka-sink/partition-selector-partitioner-config.json[]
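As a sketch: partition.selector is an assumed property name, while remote.partitioner.class is the setting named in the description above and DefaultPartitioner is the standard Kafka producer partitioner class.

[source,properties]
----
# Hypothetical sketch - partition.selector is an assumed name; remote.partitioner.class
# is the setting referred to in the text above.
partition.selector=partitioner
remote.partitioner.class=org.apache.kafka.clients.producer.internals.DefaultPartitioner
----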

Example 11 - Prefixing headers

Kafka headers are forwarded, with their names prefixed with X-Original-, in the produced SinkRecord.

include::example$/kafka-sink/prefix-headers-config.json[]
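Purely illustrative; the header forwarding property names are assumptions, while the X-Original- prefix comes from the description above.

[source,properties]
----
# Hypothetical sketch - header forwarding property names are assumed.
headers.forward=true
headers.forward.prefix=X-Original-
----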