Cassandra Source Connector

Type: Source

Class: io.lenses.streamreactor.connect.cassandra.source.CassandraSourceConnector

Target System: Database (Cassandra)

Maintainer: Lenses.io (Stream Reactor)

License: Apache License 2.0

Project: github.com/lensesio/stream-reactor

Download: GitHub Releases

This page documents version 9.0.2. Newer versions should be compatible unless there are breaking changes, but field names or default values may differ. If you notice discrepancies, please contact Axual Support.

Description

The Cassandra Source Connector polls Cassandra tables and publishes rows as records to Kafka topics. It uses KCQL (Kafka Query Language) to define which tables to read and how to map them to Kafka topics.

It is part of the open-source Lenses Stream Reactor project.

Unlike relational databases, Cassandra does not support log-based CDC in the same way as PostgreSQL or MySQL. This connector uses a polling approach — it periodically queries a table and publishes new or updated rows using either bulk mode (all rows) or incremental mode (rows newer than the last checkpoint).
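The poll-and-checkpoint behavior of incremental mode can be pictured with a short sketch. This is an illustration of the general technique only, not the connector's actual code; the row data, column names, and timestamps below are made up.

```python
from datetime import datetime

# Toy table: each row carries a time-based tracking column, as incremental
# mode expects.
rows = [
    {"id": 1, "created": datetime(2024, 1, 1, 10, 0)},
    {"id": 2, "created": datetime(2024, 1, 1, 10, 5)},
    {"id": 3, "created": datetime(2024, 1, 1, 10, 10)},
]

def poll_incremental(table, checkpoint):
    """Return rows newer than the checkpoint, plus the advanced checkpoint."""
    fresh = [r for r in table if r["created"] > checkpoint]
    new_checkpoint = max((r["created"] for r in fresh), default=checkpoint)
    return fresh, new_checkpoint

# First poll: the checkpoint sits at 10:00, so rows 2 and 3 are picked up.
checkpoint = datetime(2024, 1, 1, 10, 0)
batch, checkpoint = poll_incremental(rows, checkpoint)
print([r["id"] for r in batch])

# A second poll with an unchanged table returns nothing new.
batch2, checkpoint = poll_incremental(rows, checkpoint)
print([r["id"] for r in batch2])
```

Bulk mode, by contrast, would return every row on each poll regardless of the checkpoint.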

Features

  • Poll Cassandra tables and publish rows to Kafka topics

  • KCQL-based table-to-topic mapping

  • Configurable polling mode: bulk or incremental

  • Supports SSL and authentication

When to Use

  • You need to ingest rows from a Cassandra table into Kafka.

  • Your use case tolerates polling latency — near-real-time is sufficient.

When NOT to Use

  • You need real-time, low-latency change capture — Cassandra does not support CDC in the same way as relational databases.

  • Your Cassandra schema changes frequently — KCQL mappings require manual reconfiguration on schema changes.

Installation

The connector is available from the Lenses Stream Reactor releases page.

  1. Navigate to the releases page and select version 9.0.2.

  2. Download the Cassandra connector JAR file.

For installation steps, see Installing Connector Plugins.

Configuration

For the complete configuration reference, see the official source connector documentation.

To configure a connector in Axual Self-Service, see Starting Connectors. TIP: For Infrastructure-as-Code deployment, see the Axual Kafka Connect Boilerplates for Terraform and Management API boilerplates.

Getting Started

This section walks through connecting the Cassandra Source Connector to a Cassandra instance and publishing rows to an Axual stream.

Prerequisites

Cassandra instance

You need a running Cassandra instance reachable from the Kafka Connect cluster, with a keyspace and table containing data to publish.

If you do not have a Cassandra instance yet, you can provision one on Google Cloud using the Cassandra Cluster packaged by Bitnami.
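If you are creating a table just for this walkthrough, a minimal schema might look like the following. The keyspace, table, and column names are illustrative (chosen to match the KCQL example used later on this page), and the replication settings should be adjusted for a production cluster.

```sql
-- Illustrative test schema; SimpleStrategy with RF 1 is for testing only.
CREATE KEYSPACE IF NOT EXISTS my_keyspace_name
  WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1};

CREATE TABLE IF NOT EXISTS my_keyspace_name.long_tutorials (
  id uuid,
  title text,
  created timeuuid,
  PRIMARY KEY (id, created)
);
```

A time-based column such as `created` is what makes incremental mode possible later on.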

Axual stream

The stream where the connector will produce events must already exist in Axual Self-Service. See Creating streams if you need to create it.

Steps

Step 1 — Create a connector application

Create a connector application in Axual Self-Service. For the detailed steps, see Starting Connectors.

Step 2 — Configure the connector

  1. Provide the following minimal configuration:

     connector.class: io.lenses.streamreactor.connect.cassandra.source.CassandraSourceConnector
     connect.cassandra.contact.points: PASTE_THE_IP_ADDRESS
     connect.cassandra.port: 9042
     connect.cassandra.username: cassandra
     connect.cassandra.password: Your Cassandra password
     connect.cassandra.key.space: my_keyspace_name
     connect.cassandra.kcql: INSERT INTO long_tutorials_kafka SELECT * FROM long_tutorials PK id

     For advanced options, see the official source connector documentation.

Step 3 — Start the connector

Start the connector application from Axual Self-Service. Once running, rows from the configured Cassandra table will be published to the stream.

Step 4 — Verify

In Axual Self-Service, use stream-browse on the target stream to confirm events are arriving.

Cleanup

  1. Stop the connector application in Axual Self-Service.

  2. Remove stream access for the application if no longer needed.

  3. Delete your Cassandra instance if it was created only for testing.

Known limitations

  • The connector uses polling — it does not detect deletes or updates unless the table has a timestamp column and incremental mode is configured.

  • KCQL mappings require manual reconfiguration when the Cassandra table schema changes.

  • With minimal configuration, the connector may produce duplicate records — use a primary key in the KCQL PK clause to reduce this risk.
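As an illustration, an incremental KCQL mapping that tracks a time-based column might look like the following. The table and column names are made up, and the available `INCREMENTALMODE` values (e.g. `TIMESTAMP`, `TIMEUUID`, `TOKEN`) depend on your Stream Reactor version — check the official documentation before relying on one.

```sql
INSERT INTO long_tutorials_kafka
SELECT * FROM long_tutorials
PK created
INCREMENTALMODE=TIMEUUID
```

Here `PK created` names the column the connector uses to checkpoint its progress, so only rows newer than the last poll are re-published.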

The connector class io.lenses.streamreactor.connect.cassandra.source.CassandraSourceConnector is the correct class for Stream Reactor 9.x. Stream Reactor 10+ (which requires Kafka 4.x) introduces a rewritten Cassandra connector under a different class name. As long as Axual Connect runs on Kafka 3.x, this class name is the correct and only valid option.

Examples

Minimal configuration

{
  "name": "my-cassandra-source",
  "config": {
    "connector.class": "io.lenses.streamreactor.connect.cassandra.source.CassandraSourceConnector",
    "connect.cassandra.contact.points": "123.123.123.123",
    "connect.cassandra.port": "9042",
    "connect.cassandra.username": "cassandra",
    "connect.cassandra.password": "<your-cassandra-password>",
    "connect.cassandra.key.space": "my_keyspace_name",
    "connect.cassandra.kcql": "INSERT INTO long_tutorials_kafka SELECT * FROM long_tutorials PK id"
  }
}
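For comparison, an incremental-mode variant of the minimal configuration above might look like the following. The `created` column and the `INCREMENTALMODE=TIMEUUID` setting are illustrative assumptions; consult the official source connector documentation for the modes supported by your version.

```json
{
  "name": "my-cassandra-source-incremental",
  "config": {
    "connector.class": "io.lenses.streamreactor.connect.cassandra.source.CassandraSourceConnector",
    "connect.cassandra.contact.points": "123.123.123.123",
    "connect.cassandra.port": "9042",
    "connect.cassandra.username": "cassandra",
    "connect.cassandra.password": "<your-cassandra-password>",
    "connect.cassandra.key.space": "my_keyspace_name",
    "connect.cassandra.kcql": "INSERT INTO long_tutorials_kafka SELECT * FROM long_tutorials PK created INCREMENTALMODE=TIMEUUID"
  }
}
```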

License