You can call this operation to query the information about a job.
Request parameters
Parameter | Type | Required | Example | Description |
---|---|---|---|---|
Action | String | Yes | DescribeFlowJob | The operation that you want to perform. This parameter is required if you send API requests by using HTTP or HTTPS URLs. Set the value to DescribeFlowJob. |
Id | String | Yes | FJ-BCCAE48B90CC**** | The ID of the job. |
ProjectId | String | Yes | FP-257A173659F5**** | The ID of the project. |
RegionId | String | Yes | cn-hangzhou | The ID of the region. |
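You can send the request as a signed HTTP or HTTPS call, as shown in the sample request later in this topic, or through an SDK. The following is a minimal Python sketch that uses the generic CommonRequest interface of aliyun-python-sdk-core; the endpoint and API version shown are assumptions and should be replaced with the values from your product documentation.

```python
import json

from aliyunsdkcore.client import AcsClient
from aliyunsdkcore.request import CommonRequest

# Replace the placeholders with your own credentials.
client = AcsClient("<access_key_id>", "<access_key_secret>", "cn-hangzhou")

request = CommonRequest()
request.set_domain("emr.cn-hangzhou.aliyuncs.com")  # assumed endpoint
request.set_version("2016-04-08")                   # assumed API version
request.set_action_name("DescribeFlowJob")
request.add_query_param("Id", "FJ-BCCAE48B90CC****")
request.add_query_param("ProjectId", "FP-257A173659F5****")
request.add_query_param("RegionId", "cn-hangzhou")
request.set_accept_format("json")

# do_action_with_exception returns the raw response body.
response = client.do_action_with_exception(request)
job = json.loads(response)
print(job["Name"], job["Type"])
```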
Response parameters
Parameter | Type | Example | Description |
---|---|---|---|
CategoryId | String | FC-5BD9575E3462**** | The directory ID of the job. |
Description | String | This is the description of a job | The description of the job. |
EnvConf | String | {"key":"value"} | The environment variables configured for the job. |
FailAct | String | CONTINUE | The action to take when the node instance fails. Valid values: STOP and CONTINUE. |
GmtCreate | Long | 1538017814000 | The creation time of the job. |
GmtModified | Long | 1538017814000 | The modification time of the job. |
Id | String | FJ-BCCAE48B90CC**** | The ID of the job. |
LastInstanceId | String | FJI-0BA97D0BB8F3**** | The ID of the last executed job instance. |
MaxRetry | Integer | 5 | The maximum number of retries. Valid values: 0 to 5. |
MonitorConf | String | {"inputs":[{"type":"KAFKA","clusterId":"C-1234567","topics":"kafka_topic","consumer.group":"kafka_consumer_group"}],"outputs":[{"type":"KAFKA","clusterId":"C-1234567","topics":"kafka_topic"}]} | The monitoring configuration. Only jobs of the SPARK_STREAMING type support this parameter. |
Name | String | my_shell_job | The name of the job. |
ParamConf | String | {"date":"${yyyy-MM-dd}"} | The configuration parameters of the job. |
Params | String | ls -l | The content of the job. |
RequestId | String | 1549175a-6d14-4c8a-89f9-5e28300f6d7e | The ID of the request. |
ResourceList | Array | | The information about the resources of the job. |
Resource | | | |
Alias | String | demo.jar | The alias of the resource. |
Path | String | oss://path/demo.jar | The storage path of the resource. |
RetryInterval | Long | 200 | The retry interval. Valid values: 0 to 300. Unit: seconds. |
RunConf | String | {"priority":1,"userName":"hadoop","memory":2048,"cores":1} | The scheduling parameters configured for the job. |
Type | String | SHELL | The type of the job. Valid values: SPARK_SQL, SPARK_STREAMING, MR, SQOOP, PIG, FLINK, STREAMING_SQL, IMPALA_SQL, PRESTO_SQL, SPARK, Hive_SQL, Hive, SHELL, and SPARK_SHELL. |
mode | String | Yarn | The running mode of the job. Valid values: YARN and LOCAL. |
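Note that fields such as EnvConf, ParamConf, MonitorConf, RunConf, CustomVariables, and AlertConf are returned as strings that contain serialized JSON (see the JSON sample below), so they usually need a second decoding step. The following minimal Python sketch illustrates the idea with a hypothetical response dictionary.

```python
import json

# Hypothetical response body, shaped like the JSON sample in this topic.
job = {
    "Name": "my_shell_job",
    "Type": "SHELL",
    "RunConf": '{"priority":1,"userName":"hadoop","memory":2048,"cores":1}',
    "ParamConf": '{"date":"${yyyy-MM-dd}"}',
    "MonitorConf": "",
}

# RunConf, ParamConf, EnvConf, MonitorConf, and similar fields are JSON
# documents serialized as strings, so decode them again before use.
def decode(field):
    value = job.get(field)
    return json.loads(value) if value else {}

run_conf = decode("RunConf")
param_conf = decode("ParamConf")
print(run_conf["memory"], param_conf["date"])
```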
Examples
Sample request
http(s)://[Endpoint]/?Action=DescribeFlowJob
&Id=FJ-BCCAE48B90CC****
&ProjectId=FP-257A173659F5****
&RegionId=cn-hangzhou
&<Common request parameters>
Sample success responses
XML format
<FailAct>STOP</FailAct>
<CategoryId>FC-F2495319DA05*</CategoryId>
<Description>shell</Description>
<RequestId>7D2B1B2E-8D89-49C1-8D31-097C83879D20</RequestId>
<GmtCreate>1538017814000</GmtCreate>
<GmtModified>1538017814000</GmtModified>
<Params>ls -l</Params>
<ParamConf>{}</ParamConf>
<LastInstanceId>FJI-0BA97D0BB8F3****</LastInstanceId>
<MaxRetry>0</MaxRetry>
<MaxRunningTimeSec>0</MaxRunningTimeSec>
<Name>shell_copy</Name>
<Type>SHELL</Type>
<ResourceList>
</ResourceList>
<RetryInterval>15</RetryInterval>
<Id>FJ-C7FB9F1075C7****</Id>
<CustomVariables>[]</CustomVariables>
<AlertConf>{}</AlertConf>
JSON format
{
"FailAct": "STOP",
"CategoryId": "FC-F2495319DA05*",
"Description": "shell",
"RequestId": "7D2B1B2E-8D89-49C1-8D31-097C83879D20",
"GmtCreate": 1538017814000,
"GmtModified": 1538017814000,
"Params": "ls -l",
"ParamConf": "{}",
"LastInstanceId": "FJI-0BA97D0BB8F3****",
"MaxRetry": 0,
"MaxRunningTimeSec": 0,
"Name": "shell_copy",
"Type": "SHELL",
"ResourceList": {
"Resource": []
},
"RetryInterval": 15,
"Id": "FJ-C7FB9F1075C7****",
"CustomVariables": "[]",
"AlertConf": "{}"
}
Error codes
For a list of error codes, visit the API Error Center.