Google Pub/Sub Source Connector

Type

Source

Class

com.google.pubsub.kafka.source.CloudPubSubSourceConnector

Target System

Google Cloud Pub/Sub

Maintainer

Google

License

Apache License 2.0

Project

github.com/GoogleCloudPlatform/pubsub

Download

GitHub Releases

This page documents version 2.8.2. Newer versions should be compatible unless there are breaking changes, but field names or default values may differ. If you notice discrepancies, please contact Axual Support.

The GoogleCloudPlatform/pubsub repository used by this connector has been deprecated by Google since 2022. The successor project is googleapis/java-pubsub-group-kafka-connector. Migration involves updating the connector class names and download location. This documentation page has not yet been updated to reflect the new project.

Description

The Google Pub/Sub Source Connector reads messages from a Google Cloud Pub/Sub subscription and publishes them as records to a Kafka topic.

It is maintained by Google as part of the open-source github.com/GoogleCloudPlatform/pubsub project.

Features

  • Consume messages from a Google Cloud Pub/Sub subscription and publish to Kafka

  • Configurable Kafka partition scheme

  • Supports GCP service account authentication via JSON key file

When to Use

  • You need to bridge Google Cloud Pub/Sub into Kafka for downstream processing.

  • You want to consume Pub/Sub messages in a Kafka-native pipeline.

When NOT to Use

  • The upstream project for this connector has been deprecated by Google since 2022 (see the note above); for new deployments, consider its successor, googleapis/java-pubsub-group-kafka-connector.

Installation

The connector is available from GitHub Releases.

  1. Navigate to the releases page and select the version matching your Kafka Connect installation.

  2. Download the JAR file.

For installation steps, see Installing Connector Plugins.

Configuration

For the complete configuration reference, see the official source connector documentation.

To configure a connector in Axual Self-Service, see Starting Connectors. TIP: For Infrastructure-as-Code deployment, see the Axual Kafka Connect Boilerplates for Terraform and Management API boilerplates.

Getting Started

Prerequisites

Google Cloud project and service account

We’ll set up Google Pub/Sub using the Google Cloud console.
If you have a Google account, you can sign up for a free trial of Google Cloud.

You can read more about Google Pub/Sub here.

  1. Create a new service account. (You might get prompted to select your Google project first).
    This service account can be used for both Google Pub/Sub and Google Pub/Sub Lite.

    • At the top of the page, click +CREATE SERVICE ACCOUNT. A new page will open. Fill in the following:

      • Service account name: my-pubsub-admin

      • Service account ID: my-pubsub-admin

      • Service account description: "Temporary admin account to manage Google Pub/Sub"

    • Click CREATE AND CONTINUE

    • We’ll grant this service account access to the project, specifically two roles.

      • Click the dropdown called "Select a role" and, in the filter box, type Pub/Sub Lite Admin. Click "Pub/Sub Lite Admin".

      • Click +ADD ANOTHER ROLE. Repeat the previous step with Pub/Sub Admin as a filter.

    • We are not granting users access to this service account, so click DONE.

  2. Click the newly created service account. A new page will open.

  3. Click the KEYS tab.

  4. Click the ADD KEY button and select "Create a new key". Use "JSON" Key type and click CREATE.
    A key will be created and automatically downloaded to your machine. We’ll refer to it later as "the downloaded key file".
    You can now close the "Service accounts" tab.
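As a quick sanity check before configuring the connector, the downloaded key file can be inspected programmatically. A minimal sketch in Python; the file name is hypothetical, and the checked fields are the standard ones present in every GCP service account key file:

```python
import json

def check_key_file(path: str) -> str:
    """Verify that a file looks like a GCP service account key and return its project_id."""
    with open(path) as f:
        key = json.load(f)
    # Every service account key file carries these fields.
    for field in ("type", "project_id", "private_key", "client_email"):
        if field not in key:
            raise ValueError(f"missing field: {field}")
    if key["type"] != "service_account":
        raise ValueError("not a service account key")
    return key["project_id"]

# Hypothetical file name; use the path of your downloaded key file.
# print(check_key_file("my-pubsub-admin-key.json"))
```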

Pub/Sub topic and subscription

  1. Enable Pub/Sub for your project by visiting this page.

  2. At the top of the page, click + CREATE TOPIC

    • Topic ID: my_pubsub_google_topic

    • Check the "Add a default subscription" box (should be checked by default)

    • Do not check the "schema" and "message retention" boxes

    • Encryption: Use a "Google-managed encryption key" (default)

    • Click CREATE TOPIC
      A subscription called my_pubsub_google_topic-sub is automatically created.

  3. Click the MESSAGES tab, click PUBLISH MESSAGE, and publish at least one test message.

Axual stream

The stream where the connector will produce events must already exist in Axual Self-Service. See Creating streams if you need to create it.

Steps

Step 1 — Create a connector application

  1. Follow the Creating streams documentation to create one stream and deploy it onto an environment.
    The name of the stream will be my_pubsub_kafka_topic.
    The key/value types will be String/String.

  2. Follow the Configure and install a connector documentation to set up a new Connector-Application.
    Let’s call it my_pubsub_source.
    The plugin name is com.google.pubsub.kafka.source.CloudPubSubSourceConnector.
    If the plugin isn’t available, ask a platform operator to install it.

Step 2 — Configure the connector

  1. Open the previously downloaded GCP service account key file. Provide the following configuration:

    cps.project

    Copy the project_id value from the downloaded key file

    Example value:
    decisive-lambda-369420

    gcp.credentials.json

    Copy the entire JSON as-is from the downloaded key file. Newlines will be automatically excluded.

    cps.subscription

    my_pubsub_google_topic-sub

    kafka.topic

    my_pubsub_kafka_topic

    kafka.partition.scheme

    kafka_partitioner

    For advanced options, see the official source connector documentation.

  2. Authorize the my_pubsub_source source Connector-Application to produce to the my_pubsub_kafka_topic stream.
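The mapping above, from the downloaded key file to the connector settings, can be sketched in Python (a sketch, not Axual tooling; the key-file path is hypothetical, and the subscription/topic names are the ones used in this guide):

```python
import json

def build_connector_config(key_file_path: str) -> dict:
    """Derive the connector settings from a downloaded service account key file.

    cps.project is taken from the key file's project_id, and
    gcp.credentials.json is the entire key file content as one string.
    """
    with open(key_file_path) as f:
        raw = f.read()
    key = json.loads(raw)
    return {
        "connector.class": "com.google.pubsub.kafka.source.CloudPubSubSourceConnector",
        "cps.project": key["project_id"],
        "gcp.credentials.json": raw,
        "cps.subscription": "my_pubsub_google_topic-sub",
        "kafka.topic": "my_pubsub_kafka_topic",
        "kafka.partition.scheme": "kafka_partitioner",
    }
```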

Step 3 — Start the connector

Start the connector application from Axual Self-Service.

Step 4 — Verify

In Axual Self-Service, use stream-browse on the my_pubsub_kafka_topic stream to confirm events are arriving. Note: the values will be base64 encoded.
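Since stream-browse shows Base64-encoded values, a copied value can be decoded back to the original Pub/Sub payload, for example in Python:

```python
import base64

def decode_value(b64_value: str) -> str:
    """Decode a Base64-encoded record value back to the original message text."""
    return base64.b64decode(b64_value).decode("utf-8")

# Example: this input is the Base64 encoding of "hello pubsub".
print(decode_value("aGVsbG8gcHVic3Vi"))  # → hello pubsub
```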

Cleanup

When you are done:

  1. Stop the connector application in Axual Self-Service.

  2. Remove stream access for the application if no longer needed.

  3. Return to your Pub/Sub topics list and delete your Google topics.

  4. Delete the my-pubsub-admin service account.

  5. Check your IAM principals and remove any principals created as part of this example.

Known limitations

  • Pub/Sub message values are delivered to Kafka as base64-encoded strings.

Examples

Minimal configuration

{
  "name": "my-pubsub-source",
  "config": {
    "connector.class": "com.google.pubsub.kafka.source.CloudPubSubSourceConnector",
    "cps.project": "decisive-lambda-369420",
    "gcp.credentials.json": "<contents-of-gcp-service-account-key-json>",
    "cps.subscription": "my_pubsub_google_topic-sub",
    "kafka.topic": "my_pubsub_kafka_topic",
    "kafka.partition.scheme": "kafka_partitioner"
  }
}
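Before submitting a configuration like the one above, a quick check that the required settings are present can catch typos. A small sketch; the required-key list reflects the fields used in this guide, not an exhaustive schema:

```python
import json

REQUIRED = {"connector.class", "cps.project", "gcp.credentials.json",
            "cps.subscription", "kafka.topic"}

def validate_config(payload: str) -> None:
    """Raise if the connector payload is missing any setting used in this guide."""
    config = json.loads(payload)["config"]
    missing = REQUIRED - config.keys()
    if missing:
        raise ValueError(f"missing settings: {sorted(missing)}")

# The minimal configuration above passes the check without raising.
validate_config(json.dumps({"name": "my-pubsub-source", "config": {
    "connector.class": "com.google.pubsub.kafka.source.CloudPubSubSourceConnector",
    "cps.project": "decisive-lambda-369420",
    "gcp.credentials.json": "<contents-of-gcp-service-account-key-json>",
    "cps.subscription": "my_pubsub_google_topic-sub",
    "kafka.topic": "my_pubsub_kafka_topic",
    "kafka.partition.scheme": "kafka_partitioner",
}}))
```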

License

Google Pub/Sub source and sink connectors are licensed under the Apache License, Version 2.0.