You can call the CreateFlowJob operation to create a job.
Request parameters
Parameter | Type | Required | Example | Description |
---|---|---|---|---|
Action | String | Yes | CreateFlowJob | The operation that you want to perform. This parameter is required for API requests that you create by piecing together HTTP or HTTPS URLs. Set the value to CreateFlowJob. |
Description | String | Yes | This is the description of a job | The description of the job. |
Name | String | Yes | my_shell_job | The name of the job. |
ProjectId | String | Yes | FP-257A173659F5**** | The ID of the project. |
RegionId | String | Yes | cn-hangzhou | The region ID of the instance. |
ResourceList.N.Path | String | Yes | oss://path/demo.jar | The path of the resource. OSS and HDFS paths are supported. |
Type | String | Yes | SHELL | The type of the job, for example, SHELL. |
FailAct | String | No | CONTINUE | The action to take if the node instance fails, for example, CONTINUE. |
MaxRetry | Integer | No | 5 | The maximum number of retries. Valid values: 0 to 5. |
RetryPolicy | String | No | N/A | The retry policy. This is a reserved parameter. |
MaxRunningTimeSec | Long | No | 0 | A reserved parameter. |
RetryInterval | Long | No | 200 | The retry interval. Valid values: 0 to 300. Unit: seconds. |
Params | String | No | ls -l | The content of the job. |
ParamConf | String | No | {"date":"${yyyy-MM-dd}"} | The configuration parameters of the job. |
CustomVariables | String | No | {\"scope\":\"PROJECT\",\"entityId\":\"FP-80C2FDDBF35D9CC5\",\"variables\":[{\"name\":\"v1\",\"value\":\"1\",\"properties\":{\"password\":true}}]} | The custom variables configured for the job. |
EnvConf | String | No | {"key":"value"} | The environment variables configured for the job. |
RunConf | String | No | {"priority":1,"userName":"hadoop","memory":2048,"cores":1} | The scheduling parameters configured for the job. |
MonitorConf | String | No | {"inputs":[{"type":"KAFKA","clusterId":"C-1234567","topics":"kafka_topic","consumer.group":"kafka_consumer_group"}],"outputs":[{"type":"KAFKA","clusterId":"C-1234567","topics":"kafka_topic"}]} | The monitoring configuration. Only jobs of the SPARK_STREAMING type support this parameter. |
Mode | String | No | YARN | The mode of the job, for example, YARN. |
ParentCategory | String | No | FC-5BD9575E3462**** | The ID of the parent directory of the job. |
ResourceList.N.Alias | String | No | demo.jar | The alias of the resource. |
Adhoc | Boolean | No | false | Specifies whether the job is a temporary (ad hoc) query job. |
ClusterId | String | No | C-A23BD131A862**** | The ID of the cluster. |
AlertConf | String | No | N/A | A reserved parameter. |
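The following Python sketch shows one way to assemble these parameters and call the operation through the Alibaba Cloud Python SDK core library (aliyun-python-sdk-core). It is a minimal sketch, not a definitive implementation: the endpoint domain, API version, credentials, and the parameter values are placeholders or assumptions that you should verify for your own region and project.

```python
# Minimal sketch, assuming aliyun-python-sdk-core is installed.
# The endpoint domain, API version, credentials, and IDs below are placeholders;
# confirm the correct values for your region and project before use.
from aliyunsdkcore.client import AcsClient
from aliyunsdkcore.request import CommonRequest

client = AcsClient('<your-access-key-id>', '<your-access-key-secret>', 'cn-hangzhou')

request = CommonRequest()
request.set_domain('emr.cn-hangzhou.aliyuncs.com')  # assumed EMR endpoint for cn-hangzhou
request.set_version('2016-04-08')                   # assumed API version; check your SDK docs
request.set_action_name('CreateFlowJob')
request.set_accept_format('json')

# Required parameters from the table above.
request.add_query_param('ProjectId', 'FP-257A173659F5****')
request.add_query_param('RegionId', 'cn-hangzhou')
request.add_query_param('Name', 'my_shell_job')
request.add_query_param('Description', 'This is the description of a job')
request.add_query_param('Type', 'SHELL')
request.add_query_param('ResourceList.1.Path', 'oss://path/demo.jar')

# Optional parameters such as Params or RunConf can be added the same way.
request.add_query_param('Params', 'ls -l')

response = client.do_action_with_exception(request)  # raw response body as bytes
print(response.decode('utf-8'))                      # e.g. {"RequestId": "...", "Id": "FJ-..."}
```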
Response parameters
Parameter | Type | Example | Description |
---|---|---|---|
Id | String | FJ-A23BD131A862**** | The ID of the job. |
RequestId | String | 1549175a-6d14-4c8a-89f9-5e28300f6d7e | The ID of the request. |
Examples
Sample requests
http(s)://[Endpoint]/?Action=CreateFlowJob
&Description=This is the description of a data development job
&Name=my_shell_job
&ProjectId=FP-257A173659F5****
&RegionId=cn-hangzhou
&ResourceList.1.Path=oss://path/demo.jar
&Type=SHELL
&<common request parameters>
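If you build the request URL yourself instead of using an SDK, the job parameters are plain query-string fields, as in the sample request above. The sketch below only assembles and URL-encodes those fields; a real request must also include the common request parameters and a valid signature, which are omitted here, and the endpoint shown is a placeholder.

```python
from urllib.parse import urlencode

# Placeholder endpoint; replace with the EMR endpoint for your region.
endpoint = "https://emr.cn-hangzhou.aliyuncs.com/"

params = {
    "Action": "CreateFlowJob",
    "Description": "This is the description of a data development job",
    "Name": "my_shell_job",
    "ProjectId": "FP-257A173659F5****",
    "RegionId": "cn-hangzhou",
    "ResourceList.1.Path": "oss://path/demo.jar",
    "Type": "SHELL",
    # Common request parameters and the signature must be added before sending.
}

url = endpoint + "?" + urlencode(params)
print(url)
```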
Sample success responses
XML format
<RequestId>2670BCFB-925D-4C3E-9994-8D12F7A9F538</RequestId>
<Id>FJ-BBCAE48B90CC****</Id>
JSON format
{
"RequestId": "2670BCFB-925D-4C3E-9994-8D12F7A9F538",
"Id": "FJ-BBCAE48B90CC****"
}
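The JSON response body can be parsed directly to obtain the ID of the newly created job, for example:

```python
import json

# Sample response body from above.
body = '{"RequestId": "2670BCFB-925D-4C3E-9994-8D12F7A9F538", "Id": "FJ-BBCAE48B90CC****"}'

result = json.loads(body)
job_id = result["Id"]             # ID of the newly created job
request_id = result["RequestId"]  # useful for tracing or support requests
print(job_id, request_id)
```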