Realtime Compute for Apache Flink: Manage custom connectors

Last Updated: Aug 17, 2023

This topic describes how to manage custom connectors in the console of fully managed Flink, including how to create, update, and delete a custom connector.

Background information

When you use Flink SQL to develop drafts, you must use SQL connectors to access your source tables, result tables, and dimension tables. Fully managed Flink provides multiple built-in connectors for services that are commonly used on the cloud. However, these connectors cover only a small portion of the commonly used big data technology stacks. To support more connector types, fully managed Flink allows you to develop custom connectors. To use a custom connector, you must upload the JAR file of the connector.

Precautions

  • To allow fully managed Flink to identify and use your custom connector, you must develop the connector based on the connector standards that are defined by the Apache Flink community. In particular, the JAR file of the connector must contain a service discovery file in the META-INF/services directory that declares the factory class. For more information about how to develop custom connectors, see User-defined Sources & Sinks. A minimal factory sketch is provided after this list.

  • You can upload only one JAR file for each connector type.

  • To avoid JAR file dependency conflicts, take note of the following points:

    • Make sure that the version of the Flink image is the same as the Flink version in the Project Object Model (POM) dependencies.

    • Do not package dependencies that are provided by the Flink runtime into your JAR file. To do this, add <scope>provided</scope> to these dependencies in your POM file, as shown in the POM sketch after this list.

    • Use the Shade plug-in to package third-party dependencies. For more information, see Apache Maven Shade plug-in.

  • If a deployment uses a custom connector and the custom connector is updated when the deployment is running, the deployment still uses the original custom connector. If you restart the deployment, the deployment uses the new connector.
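
The following sketch shows the minimal skeleton of a custom connector factory that complies with these standards. It is an illustration, not a complete implementation: the package, the class name, and the my-connector identifier are hypothetical, and the actual source implementation is omitted.

  // The JAR file must also contain the Java service discovery file
  // META-INF/services/org.apache.flink.table.factories.Factory, whose
  // content is the fully qualified name of this class:
  //   com.example.connector.MyDynamicTableFactory
  package com.example.connector;

  import java.util.Collections;
  import java.util.Set;

  import org.apache.flink.configuration.ConfigOption;
  import org.apache.flink.table.connector.source.DynamicTableSource;
  import org.apache.flink.table.factories.DynamicTableSourceFactory;

  public class MyDynamicTableFactory implements DynamicTableSourceFactory {

      // This identifier is the value of the connector parameter in the WITH
      // clause of a DDL statement that uses the connector.
      @Override
      public String factoryIdentifier() {
          return "my-connector";
      }

      // Declare the connector options here so that Flink can validate the
      // parameters in the WITH clause.
      @Override
      public Set<ConfigOption<?>> requiredOptions() {
          return Collections.emptySet();
      }

      @Override
      public Set<ConfigOption<?>> optionalOptions() {
          return Collections.emptySet();
      }

      @Override
      public DynamicTableSource createDynamicTableSource(Context context) {
          // Return your DynamicTableSource implementation here.
          throw new UnsupportedOperationException("Implementation omitted in this sketch.");
      }
  }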
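
The following POM sketch illustrates the packaging points above, assuming that Maven is used to build the connector: Flink dependencies use the provided scope, and the Shade plug-in relocates a bundled third-party dependency. The artifact, the flink.version property, and the relocation pattern are examples only. The flink.version property must match the version of the Flink image.

  <!-- Flink core dependencies are provided by the Flink runtime and must not
       be packaged into the connector JAR file. -->
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-api-java-bridge</artifactId>
    <version>${flink.version}</version>
    <scope>provided</scope>
  </dependency>

  <!-- Package and relocate third-party dependencies with the Shade plug-in
       to avoid dependency conflicts. -->
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <relocations>
                <relocation>
                  <pattern>com.google.common</pattern>
                  <shadedPattern>com.example.shaded.com.google.common</shadedPattern>
                </relocation>
              </relocations>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>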

Create and use a custom connector

  1. Go to the Create custom connector dialog box.

    1. Log on to the Realtime Compute for Apache Flink console.

    2. On the Fully Managed Flink tab, find the workspace that you want to manage and click Console in the Actions column.

    3. In the left-side navigation pane, click Connectors.

  2. Create a custom connector.

    1. On the Connectors page, click Create Custom Connector.

    2. Upload the JAR file of the custom connector that you want to create.

      You can use one of the following methods to upload the JAR file of a custom connector:

      • Upload File: Click Click to Select and select the desired JAR file.

      • Use External URL: Enter an external URL. If the size of the JAR file exceeds 200 MB or you want to use a JAR file that is used by another service, you can use the external URL to obtain the JAR file. For example, you can enter https://ossbucket/artifacts/namespaces/flink-default/flink-jobs-1.0-SNAPSHOT.jar.

        Note
        • If the size of the JAR file exceeds 200 MB, you can save the file of the custom connector in the sql-artifacts/namespaces/{namespace} directory of the Object Storage Service (OSS) bucket that is associated with fully managed Flink. Then, use the HTTPS path of the file as the external URL.

        • You can use the HTTPS path of another service as the external URL. In this case, the service and Realtime Compute for Apache Flink must reside in the same virtual private cloud (VPC). You can also establish a network connection between the service and Realtime Compute for Apache Flink over the Internet and then use the public endpoint of Realtime Compute for Apache Flink. For more information, see How does the fully managed Flink service access the Internet?

    3. After you upload the JAR file, click Next.

      The system parses the content of the JAR file that you uploaded. If file parsing is successful, proceed to the next step. If file parsing fails, check whether the code of your custom connector complies with the standards that are defined by the Apache Flink community.

    4. Click Finish.

      The custom connector that you create appears in the connector list.

  3. Use the connector in the DDL statement of your draft.

    For more information about draft development, see Develop an SQL draft.

    Note

    The value of the connector parameter in the WITH clause of the DDL statement is the identifier that is returned by the factoryIdentifier() method of the DynamicTableFactory implementation in the JAR file of the custom connector. Other parameters in the WITH clause and their definitions vary based on the custom connector that you create.
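
    For example, if the factoryIdentifier() method of your custom connector returns my-connector, a DDL statement similar to the following one references the connector. The table schema and the endpoint parameter are hypothetical. The actual parameters depend on the options that your connector declares.

      CREATE TEMPORARY TABLE my_source (
        id BIGINT,
        name STRING
      ) WITH (
        'connector' = 'my-connector',
        -- Other parameters must match the options that the factory declares.
        'endpoint' = '<yourEndpoint>'
      );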

Update a custom connector

  1. Go to the Edit connector dialog box.

    1. Log on to the Realtime Compute for Apache Flink console.

    2. On the Fully Managed Flink tab, find the workspace that you want to manage and click Console in the Actions column.

    3. In the left-side navigation pane, click Connectors.

  2. On the Connectors tab, click the Custom Connectors tab. On the Custom Connectors tab, find the desired custom connector and click Edit to the right of the name of the custom connector.

  3. Upload the JAR file of the custom connector.

    You can use one of the following methods to upload the JAR file of a custom connector:

    • Upload File: Click Click to Select and select the desired JAR file.

    • Use External URL: Enter an external URL. For example, you can enter https://ossbucket/artifacts/namespaces/flink-default/flink-jobs-1.0-SNAPSHOT.jar.

      Note
      • If the external URL is the endpoint of an OSS bucket, the JAR file of the custom connector must be stored in the sql-artifacts/namespaces/{namespace} directory.

      • If the JAR file is stored in a service other than OSS and is not in the same VPC as fully managed Flink, enter the public endpoint. Before you use the public endpoint, you must establish a connection between fully managed Flink and the destination service. For more information, see How does the fully managed Flink service access the Internet?

  4. After you upload the JAR file, click Next.

    The system parses the content of the JAR file that you uploaded. If file parsing is successful, proceed to the next step. If file parsing fails, check whether the code of your custom connector complies with the standards that are defined by the Apache Flink community.

  5. Click Finish.

Delete a custom connector

If you no longer use a custom connector, you can perform the following steps to delete the custom connector.

  1. Go to the page on which you can delete a custom connector.

    1. Log on to the Realtime Compute for Apache Flink console.

    2. On the Fully Managed Flink tab, find the workspace that you want to manage and click Console in the Actions column.

    3. In the left-side navigation pane, click Connectors.

  2. On the Connectors tab, click the Custom Connectors tab. On the Custom Connectors tab, find the desired custom connector and click Delete to the right of the name of the custom connector.

  3. In the message that appears, click Confirm.