You can call the CreateFlowJob operation to create a job.


OpenAPI Explorer automatically calculates the signature value. For your convenience, we recommend that you call this operation in OpenAPI Explorer. You can use OpenAPI Explorer to search for API operations, call API operations, and dynamically generate SDK sample code.

Request parameters

Parameter Type Required Example Description
Action String Yes CreateFlowJob

The operation that you want to perform. This parameter is required for API requests that you create by piecing together HTTP or HTTPS URLs. Set the value to CreateFlowJob.

Description String Yes This is the description of a job

The description of the job.

Name String Yes my_shell_job

The name of the job.

ProjectId String Yes FP-257A173659F5****

The ID of the project.

RegionId String Yes cn-hangzhou

The region ID of the instance.

ResourceList.N.Path String Yes oss://path/demo.jar

The path of the resource. OSS and HDFS paths are supported.
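Parameters whose names contain `.N.`, such as ResourceList.N.Path and ResourceList.N.Alias, are repeated parameters: N is a 1-based index, and each resource in the list becomes its own numbered query parameter. A minimal Python sketch of this flattening (the helper name is illustrative, not part of the API):

```python
def flatten_resources(resources):
    """Expand a list of resource dicts into numbered ResourceList.N.* params.

    Illustrative helper only; it mirrors the ResourceList.N.Path /
    ResourceList.N.Alias naming shown in the parameter table.
    """
    params = {}
    for n, res in enumerate(resources, start=1):  # N is 1-based
        params[f"ResourceList.{n}.Path"] = res["Path"]
        if "Alias" in res:  # Alias is optional
            params[f"ResourceList.{n}.Alias"] = res["Alias"]
    return params

params = flatten_resources([{"Path": "oss://path/demo.jar", "Alias": "demo.jar"}])
```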

Type String Yes SHELL

The type of the job. Supported job types include:

  • MR
  • HIVE
  • PIG
  • SHELL
  • SPARK_STREAMING
FailAct String No CONTINUE

The action to take when the node instance fails. Valid values:

  • CONTINUE: skips the node instance
  • STOP: stops the workflow instance
MaxRetry Integer No 5

The maximum number of retries. Set this parameter to a value from 0 to 5.

RetryPolicy String No N/A

The retry policy. This is a reserved parameter.

MaxRunningTimeSec Long No 0

A reserved parameter.

RetryInterval Long No 200

The interval between retries. Valid values: 0 to 300. Unit: seconds.

Params String No ls -l

The content of the job.

ParamConf String No {"date":"${yyyy-MM-dd}"}

The configuration parameters of the job.

CustomVariables String No {\"scope\":\"PROJECT\",\"entityId\":\"FP-80C2FDDBF35D9CC5\",\"variables\":[{\"name\":\"v1\",\"value\":\"1\",\"properties\":{\"password\":true}}]}

The custom variables configured for the job.
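The CustomVariables value is a JSON object serialized into a string; the backslashes in the example are escaping from embedding that string in an outer JSON document. One way to build it in Python, using the field names from the example above:

```python
import json

# Build the CustomVariables value from the fields shown in the example;
# the escaped form in the table is this string embedded in another JSON
# document.
custom_variables = json.dumps({
    "scope": "PROJECT",
    "entityId": "FP-80C2FDDBF35D9CC5",  # the project the variables belong to
    "variables": [
        {"name": "v1", "value": "1", "properties": {"password": True}},
    ],
})
```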

EnvConf String No {"key":"value"}

The environment variables configured for the job.

RunConf String No {"priority":1,"userName":"hadoop","memory":2048,"cores":1}

The scheduling parameters configured for the job.

  • priority: the priority of the job.
  • userName: the name of the Linux user who submits the job.
  • memory: the memory allocated to the job. Unit: MB.
  • cores: the number of vCPUs allocated to the job.
MonitorConf String No {"inputs":[{"type":"KAFKA","clusterId":"C-1234567","topics":"kafka_topic","":"kafka_consumer_group"}],"outputs":[{"type":"KAFKA","clusterId":"C-1234567","topics":"kafka_topic"}]}

The monitoring configuration. This parameter is supported only for jobs of the SPARK_STREAMING type.

Mode String No YARN

The submission mode of the job. Valid values:

  • YARN: submits the job from a worker node
  • LOCAL: submits the job from a header or gateway node
ParentCategory String No FC-5BD9575E3462****

The parent directory ID of the job.

ResourceList.N.Alias String No demo.jar

The alias of the resource.

Adhoc Boolean No false

Specifies whether the job is an ad hoc query job.

ClusterId String No C-A23BD131A862****

The ID of the cluster.

AlertConf String No N/A

A reserved parameter.

Response parameters

Parameter Type Example Description
Id String FJ-A23BD131A862****

The ID of the job.

RequestId String 1549175a-6d14-4c8a-89f9-5e28300f6d7e

The ID of the request.


Sample requests

http(s)://[Endpoint]/?Action=CreateFlowJob
&Description=This is a data development job
&<common request parameters>
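The same request can be assembled programmatically. The sketch below builds the query string for a minimal CreateFlowJob call; parameter values are the examples from the table above, and the common request parameters (signature, AccessKeyId, version, timestamp, and so on) are omitted because an Alibaba Cloud SDK would normally add them for you:

```python
from urllib.parse import urlencode

# Minimal CreateFlowJob parameter set; values are the examples from the
# parameter table. Common request parameters (signature, credentials,
# version, timestamp) are omitted and would be added by an SDK.
params = {
    "Action": "CreateFlowJob",
    "RegionId": "cn-hangzhou",
    "ProjectId": "FP-257A173659F5****",
    "Name": "my_shell_job",
    "Description": "This is the description of a job",
    "Type": "SHELL",
    "ResourceList.1.Path": "oss://path/demo.jar",
}
query = urlencode(params)  # percent-encodes spaces, '*', '/', ':', etc.
```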

Sample success responses

XML format

<CreateFlowJobResponse>
      <RequestId>2670BCFB-925D-4C3E-9994-8D12F7A9F538</RequestId>
      <Id>FJ-BBCAE48B90CC****</Id>
</CreateFlowJobResponse>

JSON format

{
    "RequestId": "2670BCFB-925D-4C3E-9994-8D12F7A9F538",
    "Id": "FJ-BBCAE48B90CC****"
}