Google Pub/Sub Source Connector

Type: Source
Class: com.google.pubsub.kafka.source.CloudPubSubSourceConnector
Target System: Google Cloud Pub/Sub
Maintainer: Google
License: Apache License 2.0
Project: github.com/GoogleCloudPlatform/pubsub
Download: GitHub Releases of GoogleCloudPlatform/pubsub

This page documents version 2.8.2. Newer versions should be compatible unless there are breaking changes, but field names or default values may differ. If you notice discrepancies, please contact Axual Support.

The GoogleCloudPlatform/pubsub repository used by this connector has been deprecated by Google since 2022.
The successor project is googleapis/java-pubsub-group-kafka-connector.
Migration involves updating the connector class names and download location.
This documentation page has not yet been updated to reflect the new project.
Description
The Google Pub/Sub Source Connector reads messages from a Google Cloud Pub/Sub subscription and publishes them as records to a Kafka topic.
It is maintained by Google as part of the open-source github.com/GoogleCloudPlatform/pubsub repository.
Features
- Consume messages from a Google Cloud Pub/Sub subscription and publish them to Kafka
- Configurable Kafka partition scheme
- Supports GCP service account authentication via a JSON key file
When to Use
- You need to bridge Google Cloud Pub/Sub into Kafka for downstream processing.
- You want to consume Pub/Sub messages in a Kafka-native pipeline.
When NOT to Use
- You need to write Kafka records into Pub/Sub — use the Google Pub/Sub Sink Connector instead.
Installation
The connector is available from the project's GitHub Releases.
- Navigate to the releases page and select the version matching your Kafka Connect installation.
- Download the JAR file.
For installation steps, see Installing Connector Plugins.
Configuration
For the complete configuration reference, see the official source connector documentation.
To configure a connector in Axual Self-Service, see Starting Connectors.
TIP: For Infrastructure-as-Code deployment, see the Axual Kafka Connect Boilerplates for Terraform and Management API boilerplates.
Getting Started
Prerequisites
Google Cloud project and service account
We’ll set up Google Pub/Sub, a managed messaging service on Google Cloud.
If you have a Google account, you can sign up for a free trial of Google Cloud.
You can read more about Google Pub/Sub here.
- Create a new service account. (You might be prompted to select your Google project first.)
  This service account can be used for both Google Pub/Sub and Google Pub/Sub Lite.
- At the top of the page, click +CREATE SERVICE ACCOUNT. A new page will open. Fill in the following:
  - Service account name: my-pubsub-admin
  - Service account ID: my-pubsub-admin
  - Service account description: "Temporary admin account to manage Google Pub/Sub"
- Click CREATE AND CONTINUE
- We’ll grant this service account access to the project, and specifically two roles:
  - Click the dropdown called "Select a role" and, in the filter box, type Pub/Sub Lite Admin. Click "Pub/Sub Lite Admin".
  - Click +ADD ANOTHER ROLE. Repeat the previous step with Pub/Sub Admin as the filter.
- We are not granting users access to this service account, so click DONE instead.
- Click the newly created service account. A new page will open.
- Click the KEYS tab.
- Click the ADD KEY button and select "Create a new key". Use the "JSON" key type and click CREATE.
  A key will be created and automatically downloaded to your machine. We’ll refer to it later as "the downloaded key file".
  You can now close the "Service accounts" tab.
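If you prefer the gcloud CLI over the Cloud Console, the steps above can be sketched roughly as follows. PROJECT_ID and the key file name key.json are placeholders; the role IDs are the CLI equivalents of the two roles selected above.

```shell
# Create the service account (assumes your project is already selected
# via `gcloud config set project PROJECT_ID`)
gcloud iam service-accounts create my-pubsub-admin \
  --display-name="my-pubsub-admin" \
  --description="Temporary admin account to manage Google Pub/Sub"

# Grant the two roles on the project
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:my-pubsub-admin@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/pubsublite.admin"
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:my-pubsub-admin@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/pubsub.admin"

# Create and download a JSON key ("the downloaded key file")
gcloud iam service-accounts keys create key.json \
  --iam-account=my-pubsub-admin@PROJECT_ID.iam.gserviceaccount.com
```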
Pub/Sub topic and subscription
- Enable Pub/Sub for your project by visiting this page.
- At the top of the page, click + CREATE TOPIC
  - Topic ID: my_pubsub_google_topic
  - Check the "Add a default subscription" box (should be checked by default)
  - Do not check the "schema" and "message retention" boxes
  - Encryption: use a "Google-managed encryption key" (default)
- Click CREATE TOPIC
  A subscription called my_pubsub_google_topic-sub is automatically created.
- Click the MESSAGES tab, click PUBLISH MESSAGE, and publish at least one test message.
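The same topic setup can be done from the gcloud CLI; a sketch using the names from this guide (the test message text is illustrative):

```shell
# Create the topic and a matching subscription
gcloud pubsub topics create my_pubsub_google_topic
gcloud pubsub subscriptions create my_pubsub_google_topic-sub \
  --topic=my_pubsub_google_topic

# Publish a test message so the source connector has something to read
gcloud pubsub topics publish my_pubsub_google_topic \
  --message="hello from pubsub"
```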
Axual stream
The stream where the connector will produce events must already exist in Axual Self-Service. See Creating streams if you need to create it.
Steps
Step 1 — Create a connector application
- Follow the Creating streams documentation in order to create one stream and deploy it onto an environment.
  The name of the stream will be my_pubsub_kafka_topic.
  The key/value types will be String/String.
- Follow the Configure and install a connector documentation to set up a new Connector-Application.
  Let’s call it my_pubsub_source.
  The plugin name is com.google.pubsub.kafka.source.CloudPubSubSourceConnector.
  If the plugin isn’t available, ask a platform operator to install it.
Step 2 — Configure the connector
- Open the previously downloaded GCP service account key file. Provide the following configuration:
  - cps.project: copy the project_id value from the downloaded key file. Example value: decisive-lambda-369420
  - gcp.credentials.json: copy the entire JSON as-is from the downloaded key file. Newlines will be automatically excluded.
  - cps.subscription: my_pubsub_google_topic-sub
  - kafka.topic: my_pubsub_kafka_topic
  - kafka.partition.scheme: kafka_partitioner
  For advanced options, see the official source connector documentation.
- Authorize the my_pubsub_source Connector-Application to produce to the my_pubsub_kafka_topic stream.
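Because gcp.credentials.json must contain the whole key file as one JSON string, it can help to re-serialize the file to a single line before pasting it. A minimal Python sketch (the helper name and the key file path are illustrative, not part of the connector):

```python
import json

def flatten_key_file(path: str) -> str:
    """Read a service-account key file and re-serialize it on one line."""
    with open(path) as f:
        key = json.load(f)  # also validates that the file is well-formed JSON
    return json.dumps(key, separators=(",", ":"))

# Demonstration with an inline stub instead of a real key file:
stub = {"type": "service_account", "project_id": "decisive-lambda-369420"}
flat = json.dumps(stub, separators=(",", ":"))
print(flat)  # {"type":"service_account","project_id":"decisive-lambda-369420"}
```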
Step 4 — Verify
In Axual Self-Service, use stream-browse on the my_pubsub_kafka_topic stream to confirm events are arriving.
Note: the values will be base64 encoded.
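Decoding a base64 value from stream-browse is a one-liner; a small round-trip sketch (the sample payload is illustrative):

```python
import base64

# Encode a sample payload the way it would appear in stream-browse,
# then decode it back (payload text is illustrative).
encoded = base64.b64encode(b"hello from pubsub").decode("ascii")
decoded = base64.b64decode(encoded).decode("utf-8")
print(decoded)  # hello from pubsub
```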
Cleanup
When you are done:
- Stop the connector application in Axual Self-Service.
- Remove stream access for the application if no longer needed.
- Return to your Pub/Sub topics list and delete your Google topics.
- Delete the my-pubsub-admin service account.
- Check your IAM principals and remove any principals created as part of this example.
Examples
Minimal configuration
{
"name": "my-pubsub-source",
"config": {
"connector.class": "com.google.pubsub.kafka.source.CloudPubSubSourceConnector",
"cps.project": "decisive-lambda-369420",
"gcp.credentials.json": "<contents-of-gcp-service-account-key-json>",
"cps.subscription": "my_pubsub_google_topic-sub",
"kafka.topic": "my_pubsub_kafka_topic",
"kafka.partition.scheme": "kafka_partitioner"
}
}
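The minimal configuration above targets Axual Self-Service, but the same JSON also works on a plain Kafka Connect cluster via the Connect REST API. A sketch, assuming the REST API listens on localhost:8083 and the JSON is saved as my-pubsub-source.json (both assumptions):

```shell
# Submit the connector configuration to a self-managed Connect cluster
curl -X POST -H "Content-Type: application/json" \
  --data @my-pubsub-source.json \
  http://localhost:8083/connectors
```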
License
Google Pub/Sub source and sink connectors are licensed under the Apache License, Version 2.0.