Redis Sink Connector

Type

sink

Class

com.redis.kafka.connect.RedisSinkConnector

Target System

Redis

Maintainer

Redis

License

Apache License 2.0

Project

github.com/redis-field-engineering/redis-kafka-connect

Download

GitHub Releases

This page documents version 1.1.0. Newer versions should be compatible unless there are breaking changes, but field names or default values may differ. If you notice discrepancies, please contact Axual Support.

Description

The Redis Sink Connector consumes records from Kafka topics and writes them into a Redis database.

It is maintained by Redis as part of the open-source project at github.com/redis-field-engineering/redis-kafka-connect.

Features

  • Write Kafka records into Redis

  • Configurable Redis URI with full connection details

  • Supports Redis Cloud and self-managed Redis instances

When to Use

  • You need to write Kafka topic data into Redis for caching or real-time lookups.

  • You want to materialise a Kafka topic into Redis keys.

When NOT to Use

  • Your Redis instance is not reachable from the Kafka Connect cluster.

  • You need to read data from Redis into Kafka.

Installation

The connector is available from the GitHub Releases page.

  1. Navigate to the releases page and select the version matching your Kafka Connect installation.

  2. Download the JAR file.

For installation steps, see Installing Connector Plugins.

Configuration

For the complete configuration reference, see the official sink connector documentation.

To configure a connector in Axual Self-Service, see Starting Connectors.

TIP: For Infrastructure-as-Code deployment, see the Axual Kafka Connect Boilerplates for Terraform and the Management API.

Getting Started

Prerequisites

Redis instance

  1. Sign up for a free trial account with Redis Cloud.
    Click "Let’s start free". A Redis instance will be created automatically.

  2. Go to the Subscriptions page.
    Click the database that was created.

    • Note down the Username.

    • Scroll to Default user password and click Copy. Note down the password.

Axual stream

This connector consumes from a Kafka stream that exists and contains records in Axual Self-Service; the steps below create one and produce data to it. See Creating streams for background.

Steps

Step 1 — Create a connector application

  1. Follow the Creating streams documentation to create a stream and deploy it to an environment.
    The name of the stream will be my_redis_stream.
    The key/value types will be JSON/JSON.

  2. Produce some data as JSON/JSON events to this stream.
    It is not important what message key you use: only the value matters.

  3. Follow the Configure and install a connector documentation to set up a new Connector-Application.
    Let’s call it my_redis_sink.
    The plugin name is com.redis.kafka.connect.RedisSinkConnector.
    If the plugin isn’t available, ask a platform operator to install it.
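Step 2 above asks you to produce JSON/JSON events to the stream. A minimal sketch of what such a record could look like before it is handed to a producer; the field names are illustrative, since the connector imposes no schema on JSON values:

```python
import json

# Illustrative event value; any JSON object works for this walkthrough.
event = {
    "user_id": "42",
    "action": "page_view",
    "timestamp": "2024-01-01T00:00:00Z",
}

# Kafka transports bytes, so the value is serialized as UTF-8 JSON.
payload = json.dumps(event).encode("utf-8")

# Any key works; as noted above, only the value matters here.
key = b"42"

print(payload.decode("utf-8"))
```

Pass `key` and `payload` to whichever producer client your platform prescribes.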

Step 2 — Configure the connector

  1. Provide the following minimal configuration:

    topics

    my_redis_stream

    connect.redis.uri

    Example value:
    redis://[username]:[password]@redis-18275.c56.east-us.azure.redns.redis-cloud.com:18275

    connect.redis.password

    The default user password noted down above.
    Example value: 1234567890ABCDefghijklmnopqrstuvwxyz

    connect.redis.username

    The default username noted down above.
    Example value: default

    For advanced options, see the official sink connector documentation.

  2. Authorize the my_redis_sink sink Connector-Application to consume the my_redis_stream stream.
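The connect.redis.uri value embeds credentials, so reserved characters such as @ or : in the username or password must be percent-encoded or the URI will not parse. A small standard-library sketch, assuming the hypothetical helper name build_redis_uri and a placeholder password:

```python
from urllib.parse import quote


def build_redis_uri(username: str, password: str, host: str, port: int) -> str:
    """Build a redis:// URI with percent-encoded credentials."""
    user = quote(username, safe="")
    pwd = quote(password, safe="")
    return f"redis://{user}:{pwd}@{host}:{port}"


# "p@ss:word" contains reserved characters and must be escaped.
uri = build_redis_uri(
    "default",
    "p@ss:word",
    "redis-18275.c56.east-us.azure.redns.redis-cloud.com",
    18275,
)
print(uri)
```

The resulting URI encodes the password as p%40ss%3Aword, which Redis clients decode back to the original value on connect.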

Step 3 — Start the connector

Start the connector application from Axual Self-Service.

Step 4 — Verify

Verify that events from Kafka have been written to Redis:

  1. Open the Databases page and click your database.

  2. Click the Metrics tab.

  3. Look for the Total keys graph — you should see the number increase from zero.
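If you prefer a terminal over the Metrics tab, the connection details can be recovered from the same URI with only the standard library; the redis-cli command in the comment assumes the CLI is installed, and the credentials below are placeholders:

```python
from urllib.parse import urlsplit

# Placeholder URI matching the shape used in this guide.
uri = "redis://default:secret@redis-18275.c56.east-us.azure.redns.redis-cloud.com:18275"

parts = urlsplit(uri)
host, port = parts.hostname, parts.port
print(host, port)

# With redis-cli installed, `redis-cli -u <uri> dbsize` reports the key
# count, which should match the Total keys graph in the console.
```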

Cleanup

When you are done:

  1. Stop the connector application in Axual Self-Service.

  2. Remove stream access for the application if no longer needed.

  3. Go to the Subscriptions page → scroll to Danger zone → Delete database → Delete both.

Known limitations

No known limitations at this time.

Examples

Minimal configuration

{
  "name": "my-redis-sink",
  "config": {
    "connector.class": "com.redis.kafka.connect.RedisSinkConnector",
    "topics": "my_redis_stream",
    "connect.redis.uri": "redis://_default_:RedisSecure123@redis-18275.c56.east-us.azure.redns.redis-cloud.com:18275",
    "connect.redis.password": "RedisSecure123",
    "connect.redis.username": "_default_"
  }
}

License

The Redis Sink Connector is licensed under the Apache License, Version 2.0.