Cassandra Sink Connector

Type

Sink

Class

io.lenses.streamreactor.connect.cassandra.sink.CassandraSinkConnector

Target System

Database (Cassandra)

Maintainer

Lenses.io (Stream Reactor)

License

Apache License 2.0

Project

github.com/lensesio/stream-reactor

Download

GitHub Releases

This page documents version 9.0.2. Newer versions should be compatible unless there are breaking changes, but field names or default values may differ. If you notice discrepancies, please contact Axual Support.

Description

The Cassandra Sink Connector consumes records from Kafka topics and writes them into Cassandra tables. It uses KCQL (Kafka Query Language) to define how Kafka topics map to Cassandra tables and configure the write mode.

It is part of the open-source Lenses Stream Reactor project.

Features

  • Write Kafka records into Cassandra tables

  • KCQL-based topic-to-table mapping

  • Configurable write modes via KCQL (INSERT, UPSERT)

  • Supports SSL and authentication
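The write mode is chosen per mapping inside the KCQL statement itself. The statements below are illustrative only (topic, table, and key names are placeholders, and the exact UPSERT syntax should be checked against the official KCQL reference):

```sql
-- Plain insert: each Kafka record is written as a new row
INSERT INTO my_table SELECT * FROM my_topic PK id

-- Upsert: records with an existing primary key overwrite the stored row
UPSERT INTO my_table SELECT * FROM my_topic PK id
```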

When to Use

  • You need to write Kafka records into a Cassandra table as part of a data pipeline.

  • You want to materialise a Kafka topic as a Cassandra table for downstream querying.

When NOT to Use

  • Your Kafka records do not carry a consistent schema — KCQL mappings expect a defined structure.

  • Your Cassandra table schema changes frequently — mappings require manual reconfiguration.

Installation

The connector is available from the Lenses Stream Reactor releases page.

  1. Navigate to the releases page and select version 9.0.2.

  2. Download the Cassandra connector JAR file.

For installation steps, see Installing Connector Plugins.

Configuration

For the complete configuration reference, see the official sink connector documentation.

To configure a connector in Axual Self-Service, see Starting Connectors. TIP: For Infrastructure-as-Code deployment, see the Axual Kafka Connect Boilerplates for Terraform and Management API boilerplates.

Getting Started

This section walks through connecting the Cassandra Sink Connector to a Cassandra instance and writing Kafka records into a table.

Prerequisites

Cassandra instance

You need a running Cassandra instance reachable from the Kafka Connect cluster, with a keyspace and a target table that matches the schema of the Kafka records.

See the Cassandra Source Connector prerequisites for setup instructions.
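If you need to create the target table yourself, a minimal CQL sketch is shown below. The keyspace and table names match the tutorial values used later on this page, but the column set is an assumption and must mirror the actual schema of your Kafka records:

```sql
-- Illustrative CQL, run via cqlsh. Replication settings are suitable
-- for a single-node test instance only.
CREATE KEYSPACE IF NOT EXISTS my_keyspace_name
  WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1};

-- The columns here are an example; they must match your record schema,
-- and the PK column must match the PK clause in your KCQL statement.
CREATE TABLE IF NOT EXISTS my_keyspace_name.long_tutorials_duplicate (
  id int PRIMARY KEY,
  name text,
  created timestamp
);
```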

Axual stream with records

The Kafka stream this connector consumes must already exist and contain records. If you followed the Cassandra Source Connector guide, the long_tutorials_kafka stream is already populated.

Steps

Step 1 — Create a connector application

Create a connector application in Axual Self-Service; see Starting Connectors for detailed steps.

Step 2 — Configure the connector

  1. Provide the following minimal configuration:

    connector.class

    io.lenses.streamreactor.connect.cassandra.sink.CassandraSinkConnector

    connect.cassandra.contact.points

    PASTE_THE_IP_ADDRESS

    connect.cassandra.port

    9042

    connect.cassandra.username

    cassandra

    connect.cassandra.password

    Your Cassandra password

    connect.cassandra.key.space

    my_keyspace_name

    connect.cassandra.kcql

    INSERT INTO long_tutorials_duplicate SELECT * FROM long_tutorials_kafka PK id

    topics

    long_tutorials_kafka

    For advanced options, see the official sink connector documentation.

Step 3 — Start the connector

Start the connector application from Axual Self-Service. Once running, records from the stream will be written to the configured Cassandra table.

Step 4 — Verify

Connect to your Cassandra instance and query the target table to confirm records have arrived:

SELECT * FROM my_keyspace_name.long_tutorials_duplicate;

Cleanup

  1. Stop the connector application in Axual Self-Service.

  2. Remove stream access for the application if no longer needed.

  3. Delete your Cassandra instance if it was created only for testing.

Known limitations

  • KCQL mappings require manual reconfiguration when the Kafka record schema or Cassandra table schema changes.

  • The connector does not support Cassandra conditional writes (lightweight transactions).

The connector class io.lenses.streamreactor.connect.cassandra.sink.CassandraSinkConnector is marked deprecated by Lenses.io as of Stream Reactor 10. The replacement class (io.lenses.streamreactor.connect.cassandra.CassandraSinkConnector) is only available in Stream Reactor 10+, which requires Kafka 4.x. As long as Axual Connect runs on Kafka 3.x, Stream Reactor 9.x is the correct version and this class name is the only valid option.

Examples

Minimal configuration

{
  "name": "my-cassandra-sink",
  "config": {
    "connector.class": "io.lenses.streamreactor.connect.cassandra.sink.CassandraSinkConnector",
    "connect.cassandra.contact.points": "123.123.123.123",
    "connect.cassandra.port": "9042",
    "connect.cassandra.username": "cassandra",
    "connect.cassandra.password": "<your-cassandra-password>",
    "connect.cassandra.key.space": "my_keyspace_name",
    "connect.cassandra.kcql": "INSERT INTO long_tutorials_duplicate SELECT * FROM long_tutorials_kafka PK id",
    "topics": "long_tutorials_kafka"
  }
}
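Before deploying, the configuration can be sanity-checked programmatically. The following Python sketch verifies that the keys from the minimal configuration above are present; the required-key list is an assumption derived from this example, not an official schema:

```python
import json

# Keys taken from the minimal configuration above. This list is an
# assumption based on the example, not an official validation schema.
REQUIRED_KEYS = {
    "connector.class",
    "connect.cassandra.contact.points",
    "connect.cassandra.port",
    "connect.cassandra.key.space",
    "connect.cassandra.kcql",
    "topics",
}

def missing_keys(connector_json: str) -> set:
    """Return the required keys absent from a connector JSON document."""
    config = json.loads(connector_json).get("config", {})
    return REQUIRED_KEYS - config.keys()

example = """
{
  "name": "my-cassandra-sink",
  "config": {
    "connector.class": "io.lenses.streamreactor.connect.cassandra.sink.CassandraSinkConnector",
    "connect.cassandra.contact.points": "123.123.123.123",
    "connect.cassandra.port": "9042",
    "connect.cassandra.username": "cassandra",
    "connect.cassandra.password": "<your-cassandra-password>",
    "connect.cassandra.key.space": "my_keyspace_name",
    "connect.cassandra.kcql": "INSERT INTO long_tutorials_duplicate SELECT * FROM long_tutorials_kafka PK id",
    "topics": "long_tutorials_kafka"
  }
}
"""

print(missing_keys(example))  # set() — the example is complete
```

A check like this is useful in CI pipelines that manage connector configurations as code, catching a missing key before the connector is submitted.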

License