Realtime Compute for Apache Flink:Manage custom connectors

Last Updated: Apr 10, 2024

If built-in connectors cannot meet your business requirements, you can use custom connectors. This topic describes how to upload, use, and update a custom connector.

Precautions

  • To allow Realtime Compute for Apache Flink to identify and use your custom connector, you must develop the connector based on the connector standards that are defined by the Apache Flink community. In particular, the JAR file of the connector must contain a metadata file that declares the factory class of the connector. For more information about how to develop custom connectors, see User-defined Sources & Sinks.

  • You can upload only one JAR file for each connector type.

  • To avoid JAR file dependency conflicts, take note of the following points:

    • Make sure that the version of the Flink image is the same as the Flink version in the Project Object Model (POM) dependencies.

    • Do not package dependencies that are provided by the Flink runtime into your JAR file. To exclude them, add <scope>provided</scope> to these dependencies in the POM file.

    • Use the Shade plug-in to package third-party dependencies. For more information, see Apache Maven Shade plug-in.
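The metadata file mentioned above is a Java ServiceLoader provider-configuration file inside the JAR. A minimal sketch of its content follows; the factory class name is a hypothetical example:

```
# File path inside the JAR:
#   META-INF/services/org.apache.flink.table.factories.Factory
com.example.connector.MyDynamicTableFactory
```

The file is named after the Flink Factory interface, and each non-comment line lists the fully qualified name of one factory implementation.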
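The dependency points above can be sketched in the POM file of the connector project. The Flink version shown is a placeholder that must match your Flink image, and the artifact coordinates follow standard Apache Flink and Maven conventions:

```xml
<dependencies>
  <!-- Flink dependencies are provided by the runtime: do not package them. -->
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-common</artifactId>
    <version>1.17.1</version>
    <scope>provided</scope>
  </dependency>
</dependencies>

<build>
  <plugins>
    <!-- The Shade plug-in packages third-party dependencies into the JAR. -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.4.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```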

Upload and use a custom connector

  1. Go to the Create custom connector dialog box.

    1. Log on to the Realtime Compute for Apache Flink console.

    2. Find the workspace that you want to manage and click Console in the Actions column.

    3. In the left-side navigation pane, click Connectors.

  2. Create a custom connector.

    1. On the Connectors page, click Create Custom Connector.

    2. Upload the JAR file of the custom connector that you want to create.

      You can use one of the following methods to upload the JAR file of a custom connector:

      • Upload File: Click Click to Select and select the desired JAR file.

      • Use External URL: Enter the URL at which another service hosts the JAR file. You can use this method if the JAR file is stored in another service. For example, you can enter https://ossbucket/artifacts/namespaces/flink-default/flink-jobs-1.0-SNAPSHOT.jar.

        Note

        If a service and the Realtime Compute for Apache Flink workspace reside in the same virtual private cloud (VPC), you can enter the HTTP path of the service as the external URL. Alternatively, you can establish a network connection between the service and the Realtime Compute for Apache Flink workspace over the Internet and then enter the public endpoint of the service. For more information, see How does Realtime Compute for Apache Flink access the Internet?

    3. After you upload the JAR file, click Next.

      The system parses the content of the JAR file that you uploaded. If file parsing is successful, proceed to the next step. If file parsing fails, check whether the code of your custom connector complies with the standards that are defined by the Apache Flink community.

    4. Click Finish.

      The custom connector that you create appears in the connector list.

  3. Use the connector in the DDL statement of your draft.

    For more information about draft development, see Develop an SQL draft.

    Note

    You must set the connector parameter in the WITH clause to the identifier that is declared by the factory class (DynamicTableFactory) in the JAR file of the custom connector. The other parameters in the WITH clause and their definitions vary based on the custom connector that you create.
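For example, if the factory class in the JAR file declares the identifier my-connector (a hypothetical value), the DDL statement of the draft might look like the following. The endpoint option is likewise hypothetical and stands in for whatever options your connector defines:

```sql
CREATE TEMPORARY TABLE custom_sink (
  id BIGINT,
  name STRING
) WITH (
  -- Must match the identifier declared by the factory class in the JAR file.
  'connector' = 'my-connector',
  -- All other options are defined by the custom connector itself.
  'endpoint' = 'http://example.com/api'
);
```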

Update a custom connector

Note

After you update a custom connector, deployments that are running with the connector are not affected. The update takes effect for a deployment only after the deployment is restarted.

  1. Go to the Edit connector dialog box.

    1. Log on to the Realtime Compute for Apache Flink console.

    2. Find the workspace that you want to manage and click Console in the Actions column.

    3. In the left-side navigation pane, click Connectors.

  2. On the Connectors tab, click the Custom Connectors tab. On the Custom Connectors tab, find the desired custom connector and click Edit to the right of the name of the custom connector.

  3. Upload the JAR file of the custom connector.

    You can use one of the following methods to upload the JAR file of a custom connector:

    • Upload File: Click Click to Select and select the desired JAR file.

    • Use External URL: Enter an external URL. For example, you can enter https://ossbucket/artifacts/namespaces/flink-default/flink-jobs-1.0-SNAPSHOT.jar.

      Note
      • If the external URL is the endpoint of an OSS bucket, the JAR file of the custom connector must be stored in the sql-artifacts/namespaces/{namespace} directory.

      • If the JAR file is stored in a service other than OSS and is not in the same VPC as the Realtime Compute for Apache Flink workspace, enter the public endpoint of the service. You must establish a network connection between the Realtime Compute for Apache Flink workspace and the service over the Internet. For more information, see How does Realtime Compute for Apache Flink access the Internet?
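The OSS directory requirement above can be sketched as a small shell snippet. The bucket name and namespace are hypothetical placeholders, and the ossutil upload command is shown commented out as an assumption about your tooling:

```shell
# Build the OSS object path required for a custom connector JAR.
# Bucket and namespace are placeholder values; replace them with your own.
BUCKET="my-oss-bucket"
NAMESPACE="flink-default"
JAR="flink-jobs-1.0-SNAPSHOT.jar"
OSS_PATH="oss://${BUCKET}/sql-artifacts/namespaces/${NAMESPACE}/${JAR}"
echo "${OSS_PATH}"
# Upload with ossutil (assumes ossutil is installed and configured):
# ossutil cp "${JAR}" "${OSS_PATH}"
```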

  4. After you upload the JAR file, click Next.

    The system parses the content of the JAR file that you uploaded. If file parsing is successful, proceed to the next step. If file parsing fails, check whether the code of your custom connector complies with the standards that are defined by the Apache Flink community.

  5. Click Finish.

References

  • Realtime Compute for Apache Flink provides various built-in connectors. For more information, see Supported connectors.

  • For more information about how to use the Change Data Capture (CDC) connectors for Apache Flink, see Use a CDC connector for Apache Flink.

  • Realtime Compute for Apache Flink supports metadata management. You can create a catalog to manage and access metadata. For more information, see Manage catalogs.