
ApsaraMQ for Confluent: Manage connectors

Last Updated: Mar 10, 2026

ApsaraMQ for Confluent integrates with Confluent Connectors to stream data between Apache Kafka and external systems. A source connector ingests data from an external system, such as a database or message queue, into Kafka topics. A sink connector delivers data from Kafka topics to an external system, such as a cloud storage service or data warehouse. Some connectors support both directions.
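For illustration, after a connector plugin is installed, a connector instance is typically created by submitting a JSON configuration to the Kafka Connect REST API. The following Python sketch builds a source and a sink configuration; the connector class names are the standard Confluent ones, but the instance names, connection URL, bucket name, and endpoint are placeholders, not values from this document:

```python
import json

# Hypothetical Connect REST endpoint; replace with your cluster's address.
CONNECT_URL = "http://localhost:8083/connectors"

# Source connector: ingests rows from a database table into Kafka topics.
jdbc_source = {
    "name": "jdbc-orders-source",  # illustrative instance name
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:mysql://db-host:3306/shop",  # placeholder
        "mode": "incrementing",
        "incrementing.column.name": "id",
        "topic.prefix": "shop-",  # rows land in topics such as shop-orders
    },
}

# Sink connector: delivers records from a Kafka topic to object storage.
s3_sink = {
    "name": "s3-orders-sink",  # illustrative instance name
    "config": {
        "connector.class": "io.confluent.connect.s3.S3SinkConnector",
        "topics": "shop-orders",
        "s3.bucket.name": "my-archive-bucket",  # placeholder
        "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
        "flush.size": "1000",
    },
}

# Each payload would be POSTed to CONNECT_URL as JSON, for example with curl.
print(json.dumps(jdbc_source, indent=2))
```

The `config` map is passed through to the connector plugin, so the valid keys depend entirely on which connector from the catalog below you installed.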

This topic describes how to add and remove connectors and lists all available connectors by category.

Prerequisites

Before you begin, make sure that you have:

  • An ApsaraMQ for Confluent instance (Professional Edition or Enterprise Edition)

  • The Connect component purchased and added to your instance. To add this component, upgrade the instance configuration

Add a connector

Important

Adding a connector stops all running jobs and restarts the Connect service. Finish all running jobs before you proceed, or use the force update option if available.

Note

Enterprise Edition supports custom connectors that are not listed in the connector catalog. To install a custom connector, prepare your connector package and submit a ticket.

  1. Log on to the ApsaraMQ for Confluent console. In the left-side navigation pane, click Instances.

  2. In the top navigation bar, select the region where your instance resides. Find the target instance and click its name.

  3. In the left-side navigation pane, click Connectors. Then, click Add Connector.

  4. In the Add Connector panel, search for and select the connector. Click the install icon, and then click OK.

    Add connector panel

  5. Verify the deployment in Control Center. The connector becomes available in Control Center after installation completes in the ApsaraMQ for Confluent console.

    1. Log on to the Control Center console and go to the Cluster overview page.

    2. In the left-side navigation pane, click Connect.

    3. In the Cluster name column, select connect.

    4. Click Add connector to view the deployed connector.

    Control Center connector view
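The verification in Control Center can also be scripted: the Kafka Connect REST API exposes GET /connector-plugins, which lists the plugins installed on the Connect workers. A minimal sketch of parsing that response, using an illustrative sample payload rather than a live cluster:

```python
import json

def installed_plugin_classes(response_body: str) -> list[str]:
    """Extract connector class names from a /connector-plugins response."""
    return [entry["class"] for entry in json.loads(response_body)]

# Illustrative response body, as returned by
# GET http://<connect-host>:8083/connector-plugins on a real cluster.
sample = json.dumps([
    {"class": "io.confluent.connect.jdbc.JdbcSourceConnector",
     "type": "source", "version": "10.0.0"},
    {"class": "io.confluent.connect.s3.S3SinkConnector",
     "type": "sink", "version": "10.0.0"},
])

print(installed_plugin_classes(sample))
```

If the class of the connector you just added appears in this list, the plugin was deployed to the Connect workers successfully.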

Remove a connector

  1. In the ApsaraMQ for Confluent console, go to the Connectors page.

  2. Select the connector to remove and click Remove.

  3. In the Note message, click OK.

Connector categories

Connectors fall into three categories. The category determines licensing and availability based on your instance edition.

| Category | Professional Edition | Enterprise Edition | Description |
| --- | --- | --- | --- |
| Open Source Connectors | Supported | Supported | Open source connectors from vendors such as Debezium, MongoDB, Snowflake, and Confluent |
| Commercial Connectors | Not supported | Supported | Confluent-built connectors for enterprise integrations, including AWS, Azure, and GCP services |
| Premium Connectors | Not supported | Supported | Specialized Confluent connectors for advanced use cases such as Oracle CDC |
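The availability rules in the table reduce to a simple lookup. The sketch below encodes them; the category and edition names come from the table, while the mapping and function themselves are illustrative, not part of any ApsaraMQ API:

```python
# Supported editions per connector category, per the table above.
CATEGORY_EDITIONS = {
    "Open Source Connectors": {"Professional Edition", "Enterprise Edition"},
    "Commercial Connectors": {"Enterprise Edition"},
    "Premium Connectors": {"Enterprise Edition"},
}

def is_available(category: str, edition: str) -> bool:
    """Return True if a connector category is usable on the given edition."""
    return edition in CATEGORY_EDITIONS.get(category, set())

print(is_available("Commercial Connectors", "Professional Edition"))  # False
```

In short: Professional Edition instances are limited to the Open Source catalog, while Enterprise Edition instances can use all three categories.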

Open Source Connectors

The following 44 connectors are available to both Professional Edition and Enterprise Edition instances.

| Connector name | Owner | Type |
| --- | --- | --- |
| debezium-connector-mongodb | debezium | source |
| debezium-connector-mysql | debezium | source |
| debezium-connector-postgresql | debezium | source |
| debezium-connector-sqlserver | debezium | source |
| diffusion-connector | push | source, sink |
| gridgain-kafka-connect | gridgain | source, sink |
| ignite-connector | gridgain | source, sink |
| kafka-connect-ably | ably | sink |
| kafka-connect-aerospike-sink | aerospike | sink |
| kafka-connect-as400 | infoviewsystems | source, sink |
| kafka-connect-bigquery | wepay | sink |
| kafka-connect-cassandra-sink | datastax | sink |
| kafka-connect-cosmos | microsoftcorporation | source, sink |
| kafka-connect-couchbase | couchbase | transform, source, sink |
| kafka-connect-crux | juxt | source, sink |
| kafka-connect-datagen | confluentinc | source |
| kafka-connect-elasticsearch | confluentinc | sink |
| kafka-connect-geode | apache | source, sink |
| kafka-connect-hdfs | confluentinc | sink |
| kafka-connect-hec-sink | humio | sink |
| kafka-connect-jdbc | confluentinc | source, sink |
| kafka-connect-jdbc_flatten | norsktipping | source |
| kafka-connect-logs | datadog | sink |
| kafka-connect-mongodb | mongodb | source |
| kafka-connect-neo4j | neo4j | source |
| kafka-connect-odatabusevent | init | source |
| kafka-connect-odp | init | source |
| kafka-connect-redis | jcustenborder | sink |
| kafka-connect-rockset | rockset | sink |
| kafka-connect-s3 | confluentinc | sink |
| kafka-connect-splunk | splunk | sink |
| kafka-connect-spooldir | jcustenborder | source |
| kafka-connect-tigergraph | experoinc | source, sink |
| kafka-connect-venafi | opencredo | source |
| kafka-sink-azure-kusto | microsoftcorporation | sink |
| kinetica-connector | kinetica | source, sink |
| newrelic-kafka-connector | newrelic | sink |
| oracdc-kafka | a2solutions | source, sink |
| privitar-kafka-connector | privitar | sink |
| scylla-cdc-source-connector | scylladb | source |
| singlestore-kafka-connector | singlestore | sink |
| snowflake-kafka-connector | snowflakeinc | sink |
| streaming-connect-sink | adobeinc | sink |
| yb-kafka-connector | yugabyteinc | sink |

Commercial Connectors

The following 72 connectors are available only to Enterprise Edition instances. All Commercial Connectors are owned by confluentinc.

| Connector name | Type |
| --- | --- |
| kafka-connect-activemq | source |
| kafka-connect-activemq-sink | sink |
| kafka-connect-amps | source |
| kafka-connect-appdynamics-metrics | sink |
| kafka-connect-aws-cloudwatch-logs | source |
| kafka-connect-aws-cloudwatch-metrics | sink |
| kafka-connect-aws-dynamodb | sink |
| kafka-connect-aws-lambda | sink |
| kafka-connect-aws-redshift | sink |
| kafka-connect-azure-blob-storage | sink |
| kafka-connect-azure-blob-storage-source | source |
| kafka-connect-azure-data-lake-gen1-storage | sink |
| kafka-connect-azure-data-lake-gen2-storage | sink |
| kafka-connect-azure-event-hubs | source |
| kafka-connect-azure-functions | sink |
| kafka-connect-azure-search | sink |
| kafka-connect-azure-service-bus | source |
| kafka-connect-azure-sql-dw | sink |
| kafka-connect-cassandra | sink |
| kafka-connect-data-diode | source, sink |
| kafka-connect-databricks-delta-lake | sink |
| kafka-connect-datadog-metrics | sink |
| kafka-connect-firebase | source, sink |
| kafka-connect-ftps | source, sink |
| kafka-connect-gcp-bigtable | sink |
| kafka-connect-gcp-dataproc-sink | sink |
| kafka-connect-gcp-functions | sink |
| kafka-connect-gcp-pubsub | source |
| kafka-connect-gcp-spanner | sink |
| kafka-connect-gcs | sink |
| kafka-connect-gcs-source | source |
| kafka-connect-github | source |
| kafka-connect-hbase | sink |
| kafka-connect-hdfs2-source | source |
| kafka-connect-hdfs3 | sink |
| kafka-connect-hdfs3-source | source |
| kafka-connect-http | sink |
| kafka-connect-ibmmq | source |
| kafka-connect-ibmmq-sink | sink |
| kafka-connect-influxdb | source, sink |
| kafka-connect-jira | source |
| kafka-connect-jms | source |
| kafka-connect-jms-sink | sink |
| kafka-connect-kinesis | source |
| kafka-connect-kudu | source, sink |
| kafka-connect-maprdb | sink |
| kafka-connect-marketo | source |
| kafka-connect-mqtt | source, sink |
| kafka-connect-netezza | sink |
| kafka-connect-omnisci | sink |
| kafka-connect-pagerduty | sink |
| kafka-connect-pivotal-gemfire | sink |
| kafka-connect-prometheus-metrics | sink |
| kafka-connect-rabbitmq | transform, source |
| kafka-connect-rabbitmq-sink | sink |
| kafka-connect-s3-source | source |
| kafka-connect-salesforce | source, sink |
| kafka-connect-salesforce-bulk-api | source, sink |
| kafka-connect-servicenow | source, sink |
| kafka-connect-sftp | source, sink |
| kafka-connect-snmp | source |
| kafka-connect-solace-sink | sink |
| kafka-connect-solace-source | source |
| kafka-connect-splunk-source | source |
| kafka-connect-sqs | source |
| kafka-connect-syslog | source |
| kafka-connect-teradata | source, sink |
| kafka-connect-tibco-sink | sink |
| kafka-connect-tibco-source | source |
| kafka-connect-vertica | sink |
| kafka-connect-weblogic | source |
| kafka-connect-zendesk | source |

Premium Connectors

The following 2 connectors are available only to Enterprise Edition instances. Both are owned by confluentinc.

| Connector name | Type |
| --- | --- |
| kafka-connect-oracle-cdc | source |
| kafka-connect-splunk-s2s | source |