This topic describes how to manage user-defined functions (UDFs) in fully managed
Flink. You can register, update, and delete UDFs.
Precautions
To prevent conflicts between JAR package dependencies, take note of the following
points when you develop UDFs:
- Make sure that the Flink version that you select on the Draft Editor page is the same
as the Flink version in the POM dependency.
- Specify <scope>provided</scope> for Flink-related dependencies.
- Use the Shade plug-in to package other third-party dependencies. For more information,
see Apache Maven Shade plug-in.
For more information about how to resolve dependency conflicts between JAR packages,
see How do I troubleshoot dependency conflicts of Flink?.
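The precautions above can be sketched as a pom.xml fragment. This is an illustrative configuration, not the exact POM for your project: the artifact ID and the flink.version property are placeholders that must match the Flink version you select on the Draft Editor page.

```xml
<!-- Illustrative pom.xml fragment; artifact names and versions are placeholders. -->
<dependencies>
  <!-- Flink-related dependencies are provided by the runtime; do not bundle them. -->
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-api-java</artifactId>
    <version>${flink.version}</version>
    <scope>provided</scope>
  </dependency>
</dependencies>

<build>
  <plugins>
    <!-- Package third-party (non-Flink) dependencies into the UDF JAR. -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

With provided scope, Flink classes are available at compile time but are excluded from the shaded JAR, which avoids conflicts with the classes that the fully managed Flink runtime already ships.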
Register a UDF
Before you can use a UDF in SQL statements, you must register it.
- Log on to the Realtime Compute for Apache Flink console.
- On the Fully Managed Flink tab, find the workspace that you want to manage and click Console in the Actions column.
- In the left-side navigation pane, click Draft Editor.
- Click the UDFs tab.
- In the upper-left corner of the page, click the icon.
- Upload a UDF JAR file.

You can use one of the following methods to upload a UDF JAR file:
- Upload a file: Click click to select next to Select a file and upload the UDF Artifact file. To upload a
dependency file, click click to select next to Dependencies and upload the file that your UDF Artifact file depends on.
Note
- Your UDF JAR file is uploaded to and stored in the sql-artifacts directory of the Object Storage Service (OSS) bucket that you select. Fully managed
Flink parses the JAR file and checks whether it contains classes that implement the Flink user-defined
scalar function (UDF), user-defined aggregate function (UDAF), or user-defined table-valued
function (UDTF) interfaces. Then, fully managed Flink automatically
extracts the class names and populates the Function Name field with them.
- The dependencies of Java UDFs can be packaged into the UDF JAR file or uploaded as a
separate dependency file. We recommend that you upload the dependencies of Python UDFs
as a separate dependency file.
- External URL: Enter an external URL.
Note
- The external URL must be an OSS path. If you want fully managed Flink to access the URL over
the Internet, you must connect fully managed Flink to the Internet first. For more information,
see How does a fully managed Flink cluster access the Internet?.
- If the UDF Artifact file or its dependency file is large, we recommend that you upload
the file by using an external URL. If the external URL is the endpoint of an OSS bucket,
the dependency file of the UDF must be stored in the sql-artifacts/namespaces/{namespace} directory.
- Click OK.
In the UDFs list on the left side of the SQL Editor page, you can view all the UDFs that are
registered.
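To illustrate the kind of class that fully managed Flink detects when it parses an uploaded JAR, the following is a minimal, hypothetical Java scalar UDF. The package, class name, and logic are illustrative only, and the class requires the Flink table API (for example, flink-table-api-java with provided scope) on the classpath to compile:

```java
package com.example.udf; // hypothetical package

import org.apache.flink.table.functions.ScalarFunction;

// A minimal scalar UDF. When the JAR is parsed, fully managed Flink detects
// this class and would list "MaskPhone" in the Function Name field.
public class MaskPhone extends ScalarFunction {
    // eval() defines the function body; Flink derives argument and
    // return types by reflection.
    public String eval(String phone) {
        if (phone == null || phone.length() <= 4) {
            return phone;
        }
        return "****" + phone.substring(phone.length() - 4);
    }
}
```

After registration, such a function could be referenced in SQL by its registered name, for example SELECT MaskPhone(phone_number) FROM orders.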
Update a UDF
If you add a UDF to a UDF JAR file or change the code of a registered UDF in the file,
you can perform the following operations to update the UDF JAR file:
- Log on to the Realtime Compute for Apache Flink console.
- On the Fully Managed Flink tab, find the workspace that you want to manage and click Console in the Actions column.
- In the left-side navigation pane, click Draft Editor.
- Click the UDFs tab.
- In the UDFs list, move the pointer over the name of the UDF JAR file that you want to update
and click the icon.
- Upload a UDF JAR file.

You can use one of the following methods to upload a UDF JAR file:
- Upload a file: Click click to select next to Select a file and upload the UDF Artifact file. To upload a
dependency file, click click to select next to Dependencies and upload the file that your UDF Artifact file depends on.
- External URL: Enter an external URL.
Notice
- The UDF JAR file that you upload must contain all the classes of the registered UDFs.
- The code in the new UDF JAR file takes effect only after you restart a job or publish
a new job. Jobs that are running continue to use the original UDF JAR file.
- Click Update.
Delete a UDF
If you no longer need a UDF JAR file, perform the following operations to delete the
UDF JAR file:
- Log on to the Realtime Compute for Apache Flink console.
- On the Fully Managed Flink tab, find the workspace that you want to manage and click Console in the Actions column.
- In the left-side navigation pane, click Draft Editor.
- Click the UDFs tab.
- In the UDFs list, move the pointer over the name of the UDF JAR file that you want to delete
and click the icon.
Note Before you delete a UDF JAR file, make sure that the UDF that is registered by the
UDF JAR file is not referenced by a job or an SQL file.
- Select Unregister functions and delete associated files.
Before the UDF JAR file can be deleted, all the UDFs that are registered from the file
must be unregistered. This prevents dirty data.
- Click OK.