This topic describes the limits on developing DataStream API jobs and the methods that are used to develop DataStream API jobs in fully managed Flink.


The services provided by fully managed Flink depend on its deployment and network environments. Therefore, take note of the following limits when you develop DataStream API jobs in fully managed Flink:
  • You can publish and run jobs only in the JAR format.
  • You can use one main JAR package and multiple JAR dependencies.
  • You cannot read local configurations by using the main() method.
  • The parameters in the job code take precedence over the parameters that you configured in the console of fully managed Flink. To ensure that your job runs normally, we recommend that you configure checkpoint-related parameters in the console of fully managed Flink rather than in the job code.
  • Java Development Kit (JDK) 1.8 is used in the runtime environment. Therefore, you must also use JDK 1.8 to develop jobs.
  • Only open source Scala 2.11 is supported.
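
To match the runtime constraints above, you can pin the JDK level, and for Scala jobs the Scala version, in the properties of your Maven POM. The following is a minimal sketch; the Scala patch version 2.11.12 is an assumption, not a requirement from this document:

```xml
<properties>
    <!-- Compile with JDK 1.8 to match the fully managed Flink runtime -->
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <!-- For Scala jobs: only Scala 2.11 is supported; 2.11.12 is an assumed patch version -->
    <scala.version>2.11.12</scala.version>
    <scala.binary.version>2.11</scala.binary.version>
</properties>
```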


To prevent conflicts between JAR package dependencies, take note of the following points:
  • Make sure that the Flink version that you select on the Draft Editor page is the same as the Flink version that is specified in the dependency library of Apache Flink in the POM file. For more information about how to view the Flink version, see How do I query the engine version of Flink that is used by a job?.
  • Specify <scope>provided</scope> for Flink-related dependencies.
  • Use the Shade plug-in to package other third-party dependencies. For more information, see Apache Maven Shade plug-in.
  • Call only methods that are explicitly marked with @Public or @PublicEvolving in the source code of Apache Flink. Alibaba Cloud only ensures that Realtime Compute for Apache Flink is compatible with these methods.
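
For the provided-scope point above, a Flink core dependency might be declared as follows. The artifact name and the `${flink.version}` placeholder are illustrative; use the artifacts and the Flink version that match your draft:

```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.11</artifactId>
    <!-- Placeholder: must match the Flink version selected on the Draft Editor page -->
    <version>${flink.version}</version>
    <!-- provided: the cluster supplies Flink classes at runtime, so they are not bundled into the job JAR -->
    <scope>provided</scope>
</dependency>
```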

For more information about how to handle Flink dependency conflicts, see How do I troubleshoot dependency conflicts of Flink?.
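
The Shade plug-in mentioned above is typically configured in the build section of the POM. The following is a minimal sketch; the plug-in version shown is an assumption, not a recommendation from this document:

```xml
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <!-- The version is an assumption; use the version that fits your project -->
            <version>3.2.4</version>
            <executions>
                <execution>
                    <!-- Run the shade goal during the package phase to build a fat JAR -->
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
```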

Develop a job

Before you publish jobs to clusters in the console of fully managed Flink, develop the jobs in your on-premises environment. When you write business code, see the following references:

Use a connector

The Ververica Runtime (VVR) connectors are placed in the Maven central repository for you to use when you develop a job. You can use a connector in one of the following ways:
  • (Recommended) Package the connector as a project dependency into the JAR file of your job.
    1. Add the following configurations to the POM file of the Maven project to reference SNAPSHOT repositories:
          <repositories>
            <repository>
              <id>oss.sonatype.org-snapshot</id>
              <name>OSS Sonatype Snapshot Repository</name>
              <url>http://oss.sonatype.org/content/repositories/snapshots/</url>
              <releases><enabled>false</enabled></releases>
              <snapshots><enabled>true</enabled></snapshots>
            </repository>
            <repository>
              <id>apache.snapshots</id>
              <name>Apache Development Snapshot Repository</name>
              <url>https://repository.apache.org/content/repositories/snapshots/</url>
              <releases><enabled>false</enabled></releases>
              <snapshots><enabled>true</enabled></snapshots>
            </repository>
          </repositories>
    2. Check whether the <mirrorOf>*</mirrorOf> configuration is contained in your settings.xml configuration file.

      If the <mirrorOf>*</mirrorOf> configuration is contained in the configuration file, change the configuration to <mirrorOf>*,!oss.sonatype.org-snapshot,!apache.snapshots</mirrorOf>. This change prevents the two SNAPSHOT repositories that you configured in Step 1 from being overwritten. If the mirrorOf element contains only an asterisk (*), all repository requests are routed to the mirror, and the SNAPSHOT repositories are overwritten.

    3. Add the connector that you want to use to the Maven POM file as a project dependency. The following example shows the sample code:
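      As a sketch, a dependency declaration might look as follows. The Kafka connector and the version placeholder are illustrative assumptions; replace them with the connector and the version that you actually use:

      ```xml
      <dependency>
          <groupId>com.alibaba.ververica</groupId>
          <!-- ververica-connector-kafka is an example; substitute the connector you need -->
          <artifactId>ververica-connector-kafka</artifactId>
          <!-- Placeholder: use a version that matches your VVR engine version -->
          <version>${vvr.connector.version}</version>
      </dependency>
      ```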
      Different connector versions may correspond to different connector types. We recommend that you use the latest version for the type of the connector that you use.
      • Connector versions that contain the SNAPSHOT keyword are available only in the SNAPSHOT repository. You cannot find these versions in the Maven central repository.
      • If you use multiple connectors, you must merge the files in the META-INF directory. To merge the files, add the following code to the POM file:
            <!-- The service transformer is needed to merge META-INF/services files -->
            <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
            <transformer implementation="org.apache.maven.plugins.shade.resource.ApacheNoticeResourceTransformer">
                <projectName>Apache Flink</projectName>
            </transformer>
  • Upload the JAR package of the connector to the console of fully managed Flink and then reference it as an additional dependency.
    1. Log on to the Realtime Compute for Apache Flink console.
    2. On the Fully Managed Flink tab, find the workspace that you want to manage and click Console in the Actions column.
    3. In the left-side navigation pane, click Artifacts.
    4. Click Upload Artifact and select the JAR package that you want to upload.

      You can upload the JAR package of your self-managed connector or the JAR package of a connector that is provided by fully managed Flink.

    5. In the Additional Dependencies section of the Draft Editor page, select the JAR package that you want to use.


Quick start of a Flink JAR job