Obtains the detailed information of a Spark job.

Debugging

OpenAPI Explorer automatically calculates the signature value. We recommend that you call this operation in OpenAPI Explorer, which dynamically generates sample code for the operation in different SDKs.

Request parameters

Parameter Type Required Example Description
Action String Yes GetJobDetail

The operation that you want to perform. Set the value to GetJobDetail.

JobId String Yes j202010271622hangzhouf742a4330000923

The ID of the Spark job.

VcName String Yes MyCluster

The name of the virtual cluster (VC) on which the job was run.

Response parameters

Parameter Type Example Description
JobDetail Struct

The detailed information of the Spark job.

CreateTime String 2020-10-27 16:23:16

The time when the Spark job started to run. The time zone of the region in which the job was run is used.

CreateTimeValue String 1603786996000

The timestamp when the Spark job started to run, in milliseconds.

Detail String {\"args\":[\"100\"],\"name\":\"SparkPi\",\"className\":\"org.apache.spark.examples.SparkPi\",\"conf\":{\"spark.driver.resourceSpec\":\"medium\",\"spark.executor.instances\":5,\"spark.executor.resourceSpec\":\"medium\"},\"file\":\"local:///tmp/spark-examples.jar\"}

The JSON string that defines the Spark job.

DriverResourceSpec String small

The specifications of the driver. Valid values:

  • small: 1 CPU core and 4 GB of memory.
  • medium: 2 CPU cores and 8 GB of memory.
  • large: 4 CPU cores and 16 GB of memory.
  • xlarge: 8 CPU cores and 32 GB of memory.
ExecutorInstances String 1

The number of executors on which the Spark job runs.

ExecutorResourceSpec String small

The specifications of executors. Valid values:

  • small: 1 CPU core and 4 GB of memory.
  • medium: 2 CPU cores and 8 GB of memory.
  • large: 4 CPU cores and 16 GB of memory.
  • xlarge: 8 CPU cores and 32 GB of memory.
JobId String j202010271622hangzhouf742a4330000923

The ID of the Spark job.

JobName String SparkPi

The name of the Spark job.

LastJobAttemptId String 202105251618hzslot9906b0b40000005-0001

The ID of the last attempt to run the Spark job.

SparkUI String https://dlaui-cn-hangzhou.aliyuncs.com/?token=xxx

The URL of the Spark UI, from which you can obtain information about the Spark job. For more information, see Configure Spark UI.

Status String success

The status code of the Spark job. For more information, see the Status codes section of this topic.

SubmitTime String 2020-10-27 16:23:16

The time when the job was submitted. The time zone of the region in which the job was run is used.

SubmitTimeValue String 1603786996000

The timestamp when the Spark job was submitted, in milliseconds.

UpdateTime String 2020-10-27 16:23:16

The time when the Spark job status was last updated. The time zone of the region in which the job was run is used.

UpdateTimeValue String 1603786996000

The timestamp when the job was last updated, in milliseconds.

VcName String MyCluster

The name of the VC on which the job was run.

RequestId String 5F10AB6E-8984-4E32-B821-4E1512711B8C

The unique ID of the request.

Status codes

Status code Description

starting

The job is starting but has not started to run.

running

The job is being run.

error

The job fails because a program in the job throws an exception.

dead

The job fails due to issues such as insufficient resources.

killed

The running job is killed.

success

The job succeeds.
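
The starting and running statuses are transient, while error, dead, killed, and success are terminal. The following minimal sketch in Python polls the job status until it becomes terminal; the get_job_detail callable is a hypothetical helper that issues the request shown in the Examples section of this topic and returns the parsed JobDetail struct as a dictionary.

import time

TERMINAL_STATUSES = {"error", "dead", "killed", "success"}

def wait_for_job(get_job_detail, job_id, vc_name, interval_seconds=10):
    # Poll GetJobDetail until the Spark job reaches a terminal status.
    while True:
        job_detail = get_job_detail(job_id, vc_name)
        status = job_detail["Status"]
        if status in TERMINAL_STATUSES:
            return status
        # "starting" and "running" are transient; wait before polling again.
        time.sleep(interval_seconds)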

Examples

Sample requests

http(s)://[Endpoint]/?Action=GetJobDetail
&JobId=j202010271622hangzhouf742a4330000923
&VcName=MyCluster
&<Common request parameters>
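
The following minimal sketch shows how the preceding request could be issued with the Alibaba Cloud SDK for Python (aliyun-python-sdk-core) through a CommonRequest. The endpoint domain (openanalytics.cn-hangzhou.aliyuncs.com) and the API version (2018-07-29) used below are assumptions; verify them against the endpoint list for your region.

from aliyunsdkcore.client import AcsClient
from aliyunsdkcore.request import CommonRequest

# Replace the placeholders with your own AccessKey pair and region ID.
client = AcsClient('<access_key_id>', '<access_key_secret>', 'cn-hangzhou')

request = CommonRequest()
request.set_accept_format('json')
# Assumed endpoint domain and API version; verify them for your region.
request.set_domain('openanalytics.cn-hangzhou.aliyuncs.com')
request.set_version('2018-07-29')
request.set_action_name('GetJobDetail')
request.add_query_param('JobId', 'j202010271622hangzhouf742a4330000923')
request.add_query_param('VcName', 'MyCluster')

# The common request parameters and the signature are handled by the SDK.
response = client.do_action_with_exception(request)
print(response.decode('utf-8'))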

Sample success responses

XML format

<JobDetail>
    <Status>success</Status>
    <VcName>MyCluster</VcName>
    <ExecutorInstances>5</ExecutorInstances>
    <LastJobAttemptId>j202010271622hangzhouf742a4330000923-0001</LastJobAttemptId>
    <SparkUI>https://dlaui-cn-hangzhou.aliyuncs.com/?token=xxx</SparkUI>
    <DriverResourceSpec>medium</DriverResourceSpec>
    <CreateTime>2020-10-27 16:23:16</CreateTime>
    <JobName>SparkPi</JobName>
    <ExecutorResourceSpec>medium</ExecutorResourceSpec>
    <SubmitTime>2020-10-27 16:22:52</SubmitTime>
    <CreateTimeValue>1603786996000</CreateTimeValue>
    <UpdateTimeValue>1603787047000</UpdateTimeValue>
    <SubmitTimeValue>1603786972000</SubmitTimeValue>
    <UpdateTime>2020-10-27 16:24:07</UpdateTime>
    <JobId>j202010271622hangzhouf742a4330000923</JobId>
    <Detail>{"args":["100"],"name":"SparkPi","className":"org.apache.spark.examples.SparkPi","conf":{"spark.driver.resourceSpec":"medium","spark.executor.instances":5,"spark.executor.resourceSpec":"medium"},"file":"local:///tmp/spark-examples.jar"}</Detail>
</JobDetail>
<RequestId>5F10AB6E-8984-4E32-B821-4E1512711B8C</RequestId>

JSON format

{
  "JobDetail": {
    "Status": "success",
    "VcName": "MyCluser",
    "ExecutorInstances": 5,
    "LastJobAttemptId": "j202010271622hangzhouf742a4330000923-0001",
    "SparkUI": "https://dlaui-cn-hangzhou.aliyuncs.com/?token=xxx",
    "DriverResourceSpec": "medium",
    "CreateTime": "2020-10-27 16:23:16",
    "JobName": "SparkPi",
    "ExecutorResourceSpec": "medium",
    "SubmitTime": "2020-10-27 16:22:52",
    "CreateTimeValue": 1603786996000,
    "UpdateTimeValue": 1603787047000,
    "SubmitTimeValue": 1603786972000,
    "UpdateTime": "2020-10-27 16:24:07",
    "JobId": "j202010271622hangzhouf742a4330000923",
    "Detail": "{\"args\":[\"100\"],\"name\":\"SparkPi\",\"className\":\"org.apache.spark.examples.SparkPi\",\"conf\":{\"spark.driver.resourceSpec\":\"medium\",\"spark.executor.instances\":5,\"spark.executor.resourceSpec\":\"medium\"},\"file\":\"local:///tmp/spark-examples.jar\"}"
  },
  "RequestId": "5F10AB6E-8984-4E32-B821-4E1512711B8C"
}
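
The following minimal sketch in Python shows how the JSON response above could be processed. Note that Detail is itself a JSON string and needs a second parsing pass, and that the *TimeValue fields are UNIX timestamps in milliseconds.

import json
from datetime import datetime, timezone

# `response` holds the raw JSON payload shown above.
payload = json.loads(response)
job = payload["JobDetail"]

print(job["Status"])                                # success

# Detail is a JSON string embedded in the response; parse it separately.
detail = json.loads(job["Detail"])
print(detail["className"])                          # org.apache.spark.examples.SparkPi
print(detail["conf"]["spark.executor.instances"])   # 5

# The *TimeValue fields are UNIX timestamps in milliseconds.
submitted = datetime.fromtimestamp(int(job["SubmitTimeValue"]) / 1000, tz=timezone.utc)
print(submitted.isoformat())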

Error codes

For a list of error codes, visit the API Error Center.