Connectors
Connect external systems to Kafka
Overview
Kafka Connect is a framework for connecting Kafka with external systems such as databases, key-value stores, search indexes, and file systems. Connectors handle the integration logic, allowing you to move data in and out of Kafka without writing custom code.
The Self-Service API supports creating and managing fully managed connectors in Confluent Cloud.
Connector Naming Convention
Connector names are automatically generated based on your cluster type:
Dedicated Clusters
{business_unit}-{stage}-{id}-connector
Example: sales-dev-analytics-snowflake-connector
Shared Clusters
{landing_zone}-{business_unit}-{stage}-{id}-connector
Example: edh-shared-scada-dev-analytics-snowflake-connector
landing_zone: Shared cluster landing zone identifier (e.g., "edh-shared")
business_unit: Your organization's identifier (e.g., "scada", "sales")
stage: Environment stage (dev, qas, or run)
id: Descriptive name for your connector (e.g., "analytics-snowflake", "crm-http")
Note
The -connector suffix is appended automatically; do not include it in your id.
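The naming convention above can be sketched as a small helper. This is illustrative only; the function and its parameter names are not part of the Self-Service API, which generates names for you:

```python
def connector_name(business_unit, stage, connector_id, landing_zone=None):
    """Build a connector name following the convention.

    Dedicated clusters: {business_unit}-{stage}-{id}-connector
    Shared clusters:    {landing_zone}-{business_unit}-{stage}-{id}-connector
    """
    assert stage in ("dev", "qas", "run"), "stage must be dev, qas, or run"
    parts = [business_unit, stage, connector_id]
    if landing_zone:  # present only for shared clusters
        parts.insert(0, landing_zone)
    return "-".join(parts) + "-connector"

# Dedicated cluster example
print(connector_name("sales", "dev", "analytics-snowflake"))
# sales-dev-analytics-snowflake-connector

# Shared cluster example
print(connector_name("scada", "dev", "analytics-snowflake", landing_zone="edh-shared"))
# edh-shared-scada-dev-analytics-snowflake-connector
```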
Connector Types
Source Connectors
Source connectors ingest data from external systems into Kafka topics. They continuously poll the source system for new data and publish it to Kafka.
Sink Connectors
Sink connectors export data from Kafka topics to external systems. They consume messages from Kafka and write them to the destination.
Available Connectors
The Self-Service API supports the following connector types:
Source Connectors
Snowflake Source
Stream data from Snowflake tables into Kafka.
HTTP Source V2
Pull data from HTTP/REST APIs into Kafka.
Azure Event Hubs Source
Stream events from Azure Event Hubs into Kafka.
Datagen Source
Generate sample data for testing and development.
Amazon S3 Source
Import data from Amazon S3 buckets into Kafka topics.
Sink Connectors
Snowflake Sink
Load Kafka data into Snowflake tables.
HTTP Sink V2
Push Kafka messages to HTTP/REST endpoints.
AWS Credentials
The Amazon S3 Source connector requires AWS IAM credentials to access your S3 buckets.
AWS Access Key ID and AWS Secret Access Key — standard IAM user credentials with appropriate S3 permissions.
Required IAM Permissions
The IAM user must have the following permissions on the target S3 bucket:
- S3 Source: s3:GetObject, s3:GetBucketLocation, s3:ListBucket
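A minimal IAM policy granting these permissions might look like the following. The bucket name is a placeholder; note that s3:GetBucketLocation and s3:ListBucket apply to the bucket ARN, while s3:GetObject applies to the objects inside it:

```python
import json

# Minimal IAM policy for the S3 Source connector.
# "example-bucket" is a placeholder; substitute your bucket name.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Bucket-level permissions target the bucket ARN itself.
            "Effect": "Allow",
            "Action": ["s3:GetBucketLocation", "s3:ListBucket"],
            "Resource": "arn:aws:s3:::example-bucket",
        },
        {
            # Object-level permissions target the objects in the bucket.
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::example-bucket/*",
        },
    ],
}

print(json.dumps(policy, indent=2))
```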
Creating Connectors
To create a connector:
- Select Type: Choose the connector type that matches your integration needs
- Configure Connection: Provide credentials and connection details for the external system
- Select Topics: Specify which topics to read from (sink) or write to (source)
- Set Transforms: Optionally configure Single Message Transforms (SMTs)
- Deploy: The connector will be provisioned in Confluent Cloud
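As an example of the configuration gathered in the steps above, here is a sketch for a Datagen Source connector. Key names follow Confluent Cloud's fully managed Datagen connector, but treat them as illustrative rather than an exhaustive or authoritative reference; the topic and connector names are placeholders:

```python
# Illustrative configuration for a Datagen Source connector.
datagen_config = {
    "name": "sales-dev-orders-datagen-connector",  # follows the naming convention
    "connector.class": "DatagenSource",
    "kafka.topic": "sales-dev-orders",             # topic the source writes to
    "output.data.format": "JSON",
    "quickstart": "ORDERS",                        # built-in sample data set
    "tasks.max": "1",
}
```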
Connector States
Connectors can be in the following states:
- PROVISIONING: The connector is being created and has not started moving data yet
- RUNNING: The connector is healthy and processing data
- PAUSED: The connector is temporarily stopped and can be resumed
- FAILED: The connector encountered an error and requires attention
Best Practices
- Use descriptive connector names that indicate the source/destination
- Store sensitive credentials securely (API keys, passwords)
- Monitor connector health and set up alerts for failures
- Test connectors in dev/qas before deploying to production
- Use Dead Letter Queues (DLQ) to handle problematic messages
- Configure appropriate retry policies for transient failures
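The DLQ and retry practices above map to the standard Kafka Connect errors.* properties for sink connectors. The topic name and timing values below are illustrative, and managed connectors in Confluent Cloud may expose only a subset of these settings:

```python
# Error-handling settings for a sink connector, using standard
# Kafka Connect errors.* properties. Values are illustrative.
error_handling = {
    "errors.tolerance": "all",                                # keep running on bad records
    "errors.deadletterqueue.topic.name": "sales-dev-orders-dlq",
    "errors.deadletterqueue.context.headers.enable": "true",  # record failure context in headers
    "errors.retry.timeout": "300000",                         # retry transient failures for up to 5 min
    "errors.retry.delay.max.ms": "60000",                     # cap backoff between retries at 1 min
}
```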