Azure Cosmos DB Source Connector
Type | source
Class | com.azure.cosmos.kafka.connect.source.CosmosDBSourceConnector
Target System | Azure Cosmos DB
Maintainer | Microsoft
License | MIT License
Project | github.com/microsoft/kafka-connect-cosmosdb
Download | GitHub Releases
This page documents version 1.19.0. Newer versions should be compatible unless there are breaking changes, but field names or default values may differ. If you notice discrepancies, please contact Axual Support.
Description
The Azure Cosmos DB Source Connector reads documents from an Azure Cosmos DB container and publishes them as records to Kafka topics.
It is maintained by Microsoft as part of the open-source kafka-connect-cosmosdb project (github.com/microsoft/kafka-connect-cosmosdb).
Features
- Stream documents from Azure Cosmos DB containers to Kafka topics
- Topic-to-container mapping via the configurable connect.cosmos.containers.topicmap property
- Supports JSON record formats
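The upstream project documents the topicmap as a comma-separated list of topic#container pairs, so a single connector instance can read from several containers; verify this against the official source connector documentation for your version. A sketch of such a mapping, using hypothetical topic and container names (orders, returns), would be:

"connect.cosmos.containers.topicmap": "orders#orders-container,returns#returns-container"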
When to Use
- You need to ingest Azure Cosmos DB documents into Kafka for downstream processing.
- You want to replicate Cosmos DB changes into Kafka topics.
When NOT to Use
- You need to write data into Cosmos DB; use the Azure Cosmos DB Sink Connector instead.
Installation
The connector is available from the project's GitHub Releases page.
- Navigate to the releases page and select the version matching your Kafka Connect installation.
- Download the JAR file.
For installation steps, see Installing Connector Plugins.
Configuration
For the complete configuration reference, see the official source connector documentation.
To configure a connector in Axual Self-Service, see Starting Connectors.
TIP: For Infrastructure-as-Code deployment, see the Axual Kafka Connect Boilerplates for Terraform and the Management API boilerplates.
Getting Started
Prerequisites
Azure Cosmos DB account
- You already have an Azure Cosmos DB account.
- You have an Azure Cosmos DB container (for example, cosmosdb-testing) with documents to ingest.
- You have access to the master key (Azure Cosmos DB primary key).
Axual stream
The stream where the connector will produce events must already exist in Axual Self-Service. See Creating streams if you need to create it.
Steps
Step 1 — Create a connector application
- Follow the Creating streams documentation to create a stream and deploy it onto an environment.
  The name of the stream will be apparels.
  The key/value types will be JSON/JSON.
- Follow the Configure and install a connector documentation to set up a new Connector-Application.
  Let's call it my-custom-cosmosdb-instance.
  The plugin name is com.azure.cosmos.kafka.connect.source.CosmosDBSourceConnector.
  If the plugin isn't available, ask a platform operator to install it.
Step 2 — Configure the connector
- Provide the following minimal configuration (a complete JSON example is shown under Examples below):

  key.converter | org.apache.kafka.connect.json.JsonConverter
  topics | apparels
  value.converter | org.apache.kafka.connect.json.JsonConverter
  connect.cosmos.connection.endpoint | Insert the Azure Cosmos DB account endpoint URI
  connect.cosmos.master.key | Insert the Azure Cosmos DB master key (primary key)
  connect.cosmos.containers.topicmap | Format is topic#containername. Example: apparels#cosmosdbtesting
  connect.cosmos.databasename | Insert the name of the database. Example: cosmosdbtesting

  For advanced options, see the official source connector documentation.
- Authorize the my-custom-cosmosdb-instance source Connector-Application to produce to the apparels stream.
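To make the data flow concrete, the sketch below shows a hypothetical document in the cosmosdbtesting container; the connector reads it and publishes it as the value of a JSON record on the apparels stream. The field names are purely illustrative, and the exact record shape also depends on your converter settings:

{
  "id": "sweatshirt-001",
  "category": "apparel",
  "quantity": 12
}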
Examples
Minimal configuration
{
"name": "my-cosmosdb-source",
"config": {
"connector.class": "com.azure.cosmos.kafka.connect.source.CosmosDBSourceConnector",
"key.converter": "org.apache.kafka.connect.json.JsonConverter",
"topics": "apparels",
"value.converter": "org.apache.kafka.connect.json.JsonConverter",
"connect.cosmos.connection.endpoint": "https://my-cosmos-instance.documents.azure.com:443/",
"connect.cosmos.master.key": "<your-cosmos-master-key>",
"connect.cosmos.containers.topicmap": "apparels#cosmosdbtesting",
"connect.cosmos.databasename": "cosmosdbtesting"
}
}
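If you prefer plain JSON payloads without the Kafka Connect schema envelope, you can disable schemas on the JSON converters. The following is a variant sketch of the configuration above; key.converter.schemas.enable and value.converter.schemas.enable are standard Kafka Connect JsonConverter options, not settings specific to this connector:

{
  "name": "my-cosmosdb-source",
  "config": {
    "connector.class": "com.azure.cosmos.kafka.connect.source.CosmosDBSourceConnector",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "key.converter.schemas.enable": "false",
    "topics": "apparels",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false",
    "connect.cosmos.connection.endpoint": "https://my-cosmos-instance.documents.azure.com:443/",
    "connect.cosmos.master.key": "<your-cosmos-master-key>",
    "connect.cosmos.containers.topicmap": "apparels#cosmosdbtesting",
    "connect.cosmos.databasename": "cosmosdbtesting"
  }
}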
License
The Azure Cosmos DB Source Connector is licensed under the MIT License.