Neo4j Sink Connector

Type: Sink
Class: org.neo4j.connectors.kafka.sink.Neo4jConnector
Target System: Graph Database (Neo4j / Aura)
Maintainer: Neo4j
License: Apache License 2.0
Project: GitHub repository
Download: GitHub Releases

This page documents version 5.3.0. Newer versions are expected to remain compatible unless a breaking change is introduced, but field names or default values may differ. If you notice discrepancies, please contact Axual Support.

Description

The Neo4j Sink Connector consumes records from Kafka topics and writes them into a Neo4j or Aura database using Cypher statements.

These connectors are developed and maintained by Neo4j as part of the Neo4j Kafka Connector.

Features

  • Write Kafka records into Neo4j/Aura using topic-to-Cypher mappings

  • Flexible Cypher templates allow full control over graph write operations

  • Supports both Neo4j self-managed and Aura cloud databases

  • Works with standard Kafka Connect converters

When to Use

  • You need to ingest Kafka topic data into a Neo4j graph database.

  • You want to model Kafka events as nodes or relationships in a graph.
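
As an illustration, a single topic-to-Cypher mapping can turn each event into nodes and a relationship. The property names below (event.userId, event.orderId, event.orderDetails) are hypothetical and depend entirely on your record schema; this assumes neo4j.cypher.bind-value-as is set to event, as in the configuration later on this page:

```cypher
// Create or match the user and order nodes, link them,
// and copy the nested order details onto the order node.
MERGE (u:User {id: event.userId})
MERGE (o:Order {id: event.orderId})
MERGE (u)-[:PLACED]->(o)
SET o += event.orderDetails
```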

When NOT to Use

  • You need to read data from Neo4j into Kafka — use the Neo4j Source Connector instead.

  • You have not yet configured a Neo4j or Aura database instance.

Installation

The connector is available from the GitHub Releases page.

  1. Navigate to the releases page and select the version matching your Kafka Connect installation.

  2. Download the JAR file.

For installation steps, see Installing Connector Plugins.

Configuration

For the complete configuration reference, see the official sink connector documentation.

To configure a connector in Axual Self-Service, see Starting Connectors.

TIP: For Infrastructure-as-Code deployments, see the Axual Kafka Connect Boilerplates for Terraform and the Management API.

Getting Started

Prerequisites

Neo4j or Aura database

You need a running Neo4j or Aura database reachable from the Kafka Connect cluster. Consult the official connector documentation for environment setup instructions.

Axual stream

The stream the connector will consume must already exist in Axual Self-Service and contain records. See Creating streams if you need to create it.

Steps

Step 1 — Create a connector application

  1. Follow the Configure and install a connector documentation to set up a new Connector-Application.
    Let’s call it my_neo4j_sink.
    The plugin name is org.neo4j.connectors.kafka.sink.Neo4jConnector.
    If the plugin isn’t available, ask a platform operator to install it.

Step 2 — Configure the connector

  1. Provide the following minimal configuration:

    connector.class: org.neo4j.connectors.kafka.sink.Neo4jConnector
    topics: my_neo4j_topic
    neo4j.uri: neo4j://neo4j.example.com:7687
    neo4j.authentication.type: BASIC
    neo4j.authentication.basic.username: your Neo4j username
    neo4j.authentication.basic.password: your Neo4j password
    neo4j.cypher.topic.my_neo4j_topic: the Cypher statement to execute per record, e.g. MERGE (n:Node {id: event.id}) SET n += event
    neo4j.cypher.bind-value-as: event
    key.converter: org.apache.kafka.connect.storage.StringConverter
    value.converter: org.apache.kafka.connect.json.JsonConverter
    value.converter.schemas.enable: false

    For advanced options, see the official sink connector documentation.

  2. Authorize the my_neo4j_sink sink Connector-Application to consume the my_neo4j_topic stream.
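
With the JsonConverter and schemas.enable set to false, each record value is deserialized into a map and bound to the alias configured by neo4j.cypher.bind-value-as. For instance, a hypothetical record value such as:

```json
{ "id": "42", "name": "Alice", "status": "active" }
```

would be available as event inside the Cypher template, so MERGE (n:Node {id: event.id}) SET n += event creates or matches the node with id 42 and copies all top-level fields onto it.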

Step 3 — Start the connector

Start the connector application from Axual Self-Service.

Step 4 — Verify

Verify that records have been written to Neo4j by querying the database directly.
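
For example, in Neo4j Browser or cypher-shell you could run a query like the following. The Node label matches the example Cypher template on this page; adjust it to your own mapping:

```cypher
// List up to 10 nodes written by the connector, with their properties.
MATCH (n:Node)
RETURN n.id, properties(n)
LIMIT 10
```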

Cleanup

When you are done:

  1. Stop the connector application in Axual Self-Service.

  2. Remove stream access for the application if no longer needed.

Known limitations

  • A configured Neo4j or Aura database is required before starting this connector. No setup guidance is provided in this document — refer to the official connector documentation.

  • Each Kafka topic requires a dedicated neo4j.cypher.topic.<topic> property — there is no wildcard topic-to-Cypher mapping.

Examples

Minimal configuration

{
  "name": "my-neo4j-sink",
  "config": {
    "connector.class": "org.neo4j.connectors.kafka.sink.Neo4jConnector",
    "topics": "my-topic",
    "neo4j.uri": "neo4j://neo4j.example.com:7687",
    "neo4j.authentication.type": "BASIC",
    "neo4j.authentication.basic.username": "neo4j",
    "neo4j.authentication.basic.password": "<your-password>",
    "neo4j.cypher.topic.my-topic": "MERGE (n:Node {id: event.id}) SET n += event",
    "neo4j.cypher.bind-value-as": "event",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false"
  }
}
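
Multiple topic mappings

The sink maps exactly one Cypher template per topic: to consume several topics, list them in topics and add one neo4j.cypher.topic.<topic> property for each. The fragment below is a sketch only — the topic names and Cypher statements are illustrative, and the remaining connection properties from the minimal configuration are still required:

```json
{
  "topics": "users,orders",
  "neo4j.cypher.topic.users": "MERGE (u:User {id: event.id}) SET u += event",
  "neo4j.cypher.topic.orders": "MERGE (o:Order {id: event.id}) SET o += event"
}
```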

License

The Neo4j source/sink connector is licensed under the Apache License, Version 2.0.