If the built-in connectors don't cover your data sources or sinks, upload a custom connector JAR file to extend your Flink workspace. This page explains how to register, use, and update custom connectors.
Prerequisites
Before you begin, ensure that you have:
- A Flink workspace in the Realtime Compute for Apache Flink Management Portal
- A custom connector JAR file built according to Apache Flink standards
Usage notes
Review the following constraints before uploading your JAR file:
| Constraint | Details |
|---|---|
| Uniqueness | Each connector type can be registered only once. |
| Version alignment | The Flink version declared in your `pom.xml` must match the Flink runtime image version of your workspace. |
| `provided` scope | Set `<scope>provided</scope>` on Flink dependencies in your `pom.xml`. Bundling Flink runtime dependencies into your JAR causes conflicts with the classes the runtime already supplies. |
| Shade packaging | Bundle all third-party dependencies (except Flink runtime dependencies) using the Maven Shade Plugin to avoid classpath conflicts at runtime. |
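The two packaging constraints above can be sketched as a `pom.xml` fragment. This is a minimal illustration, not a complete build file: the `${flink.version}` property and the relocation pattern are placeholders you must adapt to your workspace's runtime version and your actual third-party dependencies.

```xml
<!-- Sketch only: artifact names, versions, and relocation patterns are placeholders. -->
<dependencies>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-api-java</artifactId>
    <!-- Must match the Flink runtime image version of your workspace. -->
    <version>${flink.version}</version>
    <!-- provided: the runtime supplies Flink classes, so do not bundle them. -->
    <scope>provided</scope>
  </dependency>
  <!-- Third-party dependencies keep the default compile scope and get shaded. -->
</dependencies>

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
          <configuration>
            <relocations>
              <!-- Hypothetical relocation to avoid classpath conflicts at runtime. -->
              <relocation>
                <pattern>com.google.common</pattern>
                <shadedPattern>com.example.shaded.com.google.common</shadedPattern>
              </relocation>
            </relocations>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

Relocating shaded packages (rather than only bundling them) is what prevents your connector's copies of common libraries from colliding with versions already on the runtime classpath.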
Register a custom connector
1. Log on to the Realtime Compute for Apache Flink Management Portal.
2. In the Actions column of the target workspace, click Console.
3. In the left navigation pane, click Connectors.
4. On the Custom Connectors tab, click Create Custom Connector.
5. Upload your JAR file using one of the following methods:
   - Upload File: Click Click to select, then select your connector JAR file.
   - Use External URL: Enter the URL of a JAR file hosted on an external service. Example: `https://ossbucket/artifacts/namespaces/flink-default/flink-jobs-1.0-SNAPSHOT.jar`

   Note: Two types of external URLs are supported:
   - The OSS bucket associated with your Flink workspace. You can view the bucket details on the Details page in the Management Portal.
   - A URL for an external storage system that Realtime Compute for Apache Flink can access (public-read or with granted permissions).
6. Click Next. Flink parses the uploaded JAR file. If parsing succeeds, proceed to the next step. If parsing fails, confirm that your connector code complies with Apache Flink standards.
7. Click Finish. The custom connector appears in the connector list.
Use a custom connector in a job
Reference the connector in your job's SQL code using the `connector` option. The value of the `connector` option is the identifier returned by the `DynamicTableFactory` implementation in your custom connector's JAR package. The other supported options are defined by your custom connector.
For details on writing job SQL, see Job development overview.
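As an illustration, assume a custom connector whose factory identifier is `my-custom-connector` and which defines `endpoint` and `topic` options. All of these names are hypothetical; substitute your connector's actual identifier and options. The table declaration in job SQL might look like:

```sql
CREATE TEMPORARY TABLE custom_source (
  id BIGINT,
  name STRING
) WITH (
  'connector' = 'my-custom-connector', -- identifier from the JAR's DynamicTableFactory
  'endpoint' = 'https://example.com',  -- hypothetical option defined by the connector
  'topic' = 'demo'                     -- hypothetical option defined by the connector
);
```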
Update a custom connector
Updating a custom connector does not affect running jobs. The updated connector takes effect only after a job is restarted.
1. Log on to the Realtime Compute for Apache Flink Management Portal.
2. In the Actions column of the target workspace, click Console.
3. In the left navigation pane, click Connectors.
4. On the Custom Connectors tab, find the target connector and click Edit next to its name.
5. Upload the new JAR file using one of the following methods:
   - Upload File: Click Click to select, then select your connector JAR file.
   - Use External URL: Enter the URL of a JAR file hosted on an external service. Example: `https://ossbucket/artifacts/namespaces/flink-default/flink-jobs-1.0-SNAPSHOT.jar`

   Note: Two types of external URLs are supported:
   - The OSS bucket associated with your Flink workspace. You can view the bucket details on the Details page in the Management Portal.
   - A URL for an external storage system that Realtime Compute for Apache Flink can access (public-read or with granted permissions).
6. Click Next. Flink parses the uploaded JAR file. If parsing succeeds, proceed to the next step. If parsing fails, confirm that your connector code complies with Apache Flink standards.
7. Click Finish.