If built-in connectors do not meet your needs, you can use custom connectors. This topic describes how to upload, use, and update a custom connector.
Precautions
For Realtime Compute for Apache Flink to correctly detect and use your custom connector, you must develop the connector according to the standards defined by the Flink community. Therefore, you must specify the connector's meta file and declare its Factory class. For more information about how to develop custom connectors, see User-defined Sources & Sinks.
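For example (the package and class names below are hypothetical), the JAR file of a custom connector typically contains a Java SPI meta file named META-INF/services/org.apache.flink.table.factories.Factory that lists the fully qualified names of the factory classes to be discovered:

```text
# Content of META-INF/services/org.apache.flink.table.factories.Factory
com.example.connector.MyDynamicTableFactory
```

Flink discovers the declared Factory classes through this file when the uploaded JAR package is parsed.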
You can register only one connector of each type.
To avoid JAR package dependency conflicts, note the following:
Keep the Flink runtime image version consistent with the Flink version specified in the Project Object Model (POM) dependencies.
Do not package JAR files that already exist at the runtime layer into your connector. To do this, add <scope>provided</scope> to the corresponding dependencies.
Package other third-party dependencies by using the Shade method. For more information about Shade packaging, see Apache Maven Shade Plugin.
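A minimal pom.xml sketch that follows these rules, assuming a Flink 1.17 runtime image (the version numbers and the dependency shown are illustrative):

```xml
<!-- Match the Flink version of the runtime image, and mark the
     dependency as provided so it is not bundled into the connector JAR. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-table-common</artifactId>
  <version>1.17.1</version>
  <scope>provided</scope>
</dependency>

<!-- Bundle other third-party dependencies with the Maven Shade plugin. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.4.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
    </execution>
  </executions>
</plugin>
```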
Upload and use a custom connector
Go to the custom connector registration page.
Log on to the Realtime Compute for Apache Flink console.
In the Actions column of the target workspace, click Console.
In the navigation pane on the left, click Connectors.
Register the custom connector.
On the Connectors page, click Create Custom Connector.
Upload the custom connector JAR file.
You can upload the custom connector JAR file in one of the following ways:
Upload File: Click Select File and choose the target connector JAR file.
External URL: Enter the URL of a JAR file that is stored in another service. For example, https://ossbucket/artifacts/namespaces/flink-default/flink-jobs-1.0-SNAPSHOT.jar
Note: Only the following two types of external URLs are supported:
A URL for the OSS bucket that you selected when you created the Flink workspace. You can view the attached OSS bucket on the Workspace Details page in the Realtime Compute for Apache Flink console.
A URL for another external storage system that Realtime Compute for Apache Flink has permission to access (public-read or granted permission).
After the upload is complete, click Next.
The system parses the content of the custom connector that you uploaded. If the parsing is successful, you can proceed to the next step. If the parsing fails, confirm that the code of your custom connector complies with Flink community standards.
Click Finish.
The created custom connector appears in the connector list.
You can use the connector in the job DDL.
For more information about job development, see Job development map.
Note: The value of the connector parameter in the WITH clause is the identifier that the DynamicTableFactory in your custom connector's JAR package declares. The other WITH parameters and their meanings are determined by your custom connector implementation.
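For example, if the factory in your connector JAR declares the identifier my-connector, the DDL might look as follows (the table schema and the endpoint option are hypothetical and depend entirely on your connector):

```sql
CREATE TEMPORARY TABLE custom_sink (
  id   BIGINT,
  name STRING
) WITH (
  'connector' = 'my-connector',       -- the identifier declared by the factory
  'endpoint'  = 'https://example.com' -- a connector-specific WITH parameter
);
```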
Update a custom connector
Updating a custom connector does not affect running jobs that use it. The jobs use the updated connector after they are restarted.
Go to the custom connector update page.
Log on to the Realtime Compute for Apache Flink console.
In the Actions column of the target workspace, click Console.
In the navigation pane on the left, click Connectors.
On the Custom Connectors tab, find the target custom connector and click Edit next to its name.
Upload the custom connector JAR file.
You can upload the custom connector JAR file in one of the following ways:
Upload File: Click Select File and choose the target connector JAR file.
External URL: Enter an external URL. For example, https://ossbucket/artifacts/namespaces/flink-default/flink-jobs-1.0-SNAPSHOT.jar
Note: Only the following two types of external URLs are supported:
A URL for the OSS bucket that you selected when you created the Flink workspace. You can view the attached OSS bucket on the Workspace Details page in the Realtime Compute for Apache Flink console.
A URL for another external storage system that Realtime Compute for Apache Flink has permission to access (public-read or granted permission).
After the upload is complete, click Next.
The system parses the content of the custom connector that you uploaded. If the parsing is successful, you can proceed to the next step. If the parsing fails, confirm that the code of your custom connector complies with Flink community standards.
Click Finish.
References
Flink provides a variety of built-in connectors. For more information about supported built-in connectors, see Supported connectors.
For more information about using community edition Change Data Capture (CDC) connectors, see Use community edition CDC connectors.
Realtime Compute for Apache Flink supports metadata management: you can create a catalog to manage and access the associated metadata. For more information, see Data Management.