Call the ListJobRuns operation to retrieve a list of Spark jobs.
RAM authorization
| Action | Access level | Resource type | Condition key | Dependent action |
| --- | --- | --- | --- | --- |
| emr-serverless-spark:ListJobRuns | list | All Resources (*) | None | None |
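As a point of reference, a minimal RAM policy statement that allows a RAM user or role to call this operation might look like the sketch below. The Resource value of "*" mirrors the All Resources scope in the table above; narrow it to specific resources where your security requirements call for it.

```json
{
  "Version": "1",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "emr-serverless-spark:ListJobRuns",
      "Resource": "*"
    }
  ]
}
```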
Request syntax
```http
GET /api/v1/workspaces/{workspaceId}/jobRuns HTTP/1.1
```
Path parameters

| Parameter | Type | Required | Description | Example |
| --- | --- | --- | --- | --- |
| workspaceId | string | Yes | The workspace ID. | w-d2d82aa09155**** |
Request parameters
| Parameter | Type | Required | Description | Example |
| --- | --- | --- | --- | --- |
| nextToken | string | No | The token that specifies the position from which to start the next read. | DD6B1B2A-5837-5237-ABE4-FF0C89568980 |
| maxResults | integer | No | The maximum number of entries to return. The maximum value is 100. | 20 |
| name | string | No | The job name. | emr-spark-demo-job |
| creator | string | No | The UID of the user who created the job. | 150976534701**** |
| jobRunId | string | No | The job run ID. | j-xxx |
| tags | array<object> | No | The list of tags. |  |
| tags[] | object | No | A tag object. |  |
| tags[].key | string | No | The tag key. | tag_key |
| tags[].value | string | No | The tag value. | value |
| states | array | No | The job run states. | ["Running","Submitted"] |
| states[] | string | No | The state of a job run. | Running |
| startTime | object | No | The time range when the job run started. |  |
| startTime.startTime | integer | No | The start of the time range, in milliseconds. | 1709740800000 |
| startTime.endTime | integer | No | The end of the time range, in milliseconds. | 1710432000000 |
| endTime | object | No | The time range when the job run ended. |  |
| endTime.startTime | integer | No | The start of the time range, in milliseconds. | 1709740800000 |
| endTime.endTime | integer | No | The end of the time range, in milliseconds. | 1710432000000 |
| resourceQueueId | string | No | The ID of the resource queue on which the Spark job runs. | dev_queue |
| jobRunDeploymentId | string | No | The deployment ID of the streaming job. | jd-b6d003f1930f**** |
| regionId | string | No | The region ID. | cn-hangzhou |
| minDuration | integer | No | The minimum runtime of the job run, in milliseconds. | 60000 |
| isWorkflow | string | No | Specifies whether the job is a workflow task. | false |
| runtimeConfigs | string | No | The runtime configurations. | [{"key":"mainClass","value":"yourClass"}] |
| applicationConfigs | string | No | The Spark configurations. | [{"key":"spark.app.name","value":"test"}] |
Response elements
| Element | Type | Description | Example |
| --- | --- | --- | --- |
|  | object | The returned data. |  |
| jobRuns | array<object> | A list of Spark job runs. |  |
| jobRuns[] | object | A Spark job run object. |  |
| jobRuns[].workspaceId | string | The workspace ID. | w-d2d82aa09155**** |
| jobRuns[].jobRunId | string | The job run ID. | jr-231231 |
| jobRuns[].name | string | The job name. | jobName |
| jobRuns[].state | string | The state of the job run. | Running |
| jobRuns[].stateChangeReason | object | The reason for the state change. |  |
| jobRuns[].stateChangeReason.code | string | The error code. | 0 |
| jobRuns[].stateChangeReason.message | string | The error message. | connection refused |
| jobRuns[].submitTime | integer | The time when the job was submitted (UNIX timestamp in milliseconds). | 1684119314000 |
| jobRuns[].endTime | integer | The time when the job ended (UNIX timestamp in milliseconds). | 1684119314000 |
| jobRuns[].codeType | string | The code type of the job. Valid values: SQL, JAR, PYTHON. | SQL |
| jobRuns[].webUI | string | The web UI of the job. | http://spark-ui |
| jobRuns[].executionTimeoutSeconds | integer | The timeout period for the job execution, in seconds. | 3600 |
| jobRuns[].creator | string | The UID of the user who created the job. | 150978934701**** |
| jobRuns[].tags | array | The tags. |  |
| jobRuns[].tags[] | Tag | A job tag. |  |
| jobRuns[].log | RunLog | The path of the run log. |  |
| jobRuns[].releaseVersion | string | The version of the Spark engine that is used to run the job. | esr-3.0.0 (Spark 3.4.3, Scala 2.12, Native Runtime) |
| jobRuns[].jobDriver | JobDriver | The information about the Spark driver. This parameter is not returned by the ListJobRuns operation. |  |
| jobRuns[].configurationOverrides | object | The advanced Spark configurations. This parameter is not returned by the ListJobRuns operation. |  |
| jobRuns[].configurationOverrides.configurations | array | A list of Spark configurations. |  |
| jobRuns[].configurationOverrides.configurations[] | Configuration | A Spark configuration object. |  |
| jobRuns[].displayReleaseVersion | string | The display version of the Spark engine that is used to run the job. | esr-3.0.0 (Spark 3.4.3, Scala 2.12) |
| jobRuns[].fusion | boolean | Indicates whether the Fusion engine is enabled for acceleration. | true |
| jobRuns[].vcoreSeconds | integer | The total number of vCores allocated to the job run, multiplied by the runtime in seconds. | 8236 |
| jobRuns[].mbSeconds | integer | The total memory in MB allocated to the job run, multiplied by the runtime in seconds. | 33030784 |
| jobRuns[].cuHours | number | The number of CUs consumed by the job run. This is an estimated value. The actual value is reflected in your bill. | 2.059 |
| jobRuns[].resourceQueueId | string | The ID of the resource queue on which the Spark job runs. | dev_queue |
| requestId | string | The request ID. | DD6B1B2A-5837-5237-ABE4-FF0C8944**** |
| nextToken | string | The token that is used to retrieve the next page of results. | 1 |
| maxResults | integer | The maximum number of entries returned for the current request. | 20 |
| totalCount | integer | The total number of entries that match the filter criteria. | 200 |
Examples
Success response
JSON format
```json
{
  "jobRuns": [
    {
      "workspaceId": "w-d2d82aa09155****",
      "jobRunId": "jr-231231",
      "name": "jobName",
      "state": "Running",
      "stateChangeReason": {
        "code": "0",
        "message": "connection refused\n"
      },
      "submitTime": 1684119314000,
      "endTime": 1684119314000,
      "codeType": "SQL",
      "webUI": "http://spark-ui",
      "executionTimeoutSeconds": 3600,
      "creator": "150978934701****",
      "tags": [
        {
          "key": "workflowId",
          "value": "wf-123test"
        }
      ],
      "log": {
        "driverStdOut": "oss://bucket/path/to/stdout",
        "driverStdError": "oss://bucket/path/to/stderr",
        "driverSyslog": "oss://bucket/path/to/syslog",
        "driverStartup": "oss://bucket/path/to/startup"
      },
      "releaseVersion": "esr-3.0.0 (Spark 3.4.3, Scala 2.12, Native Runtime)",
      "jobDriver": {
        "sparkSubmit": {
          "entryPoint": "oss://bucket/path/to/entrypoint.jar",
          "entryPointArguments": [
            "arg1"
          ],
          "sparkSubmitParameters": "--conf spark.app.name=test"
        }
      },
      "configurationOverrides": {
        "configurations": [
          {
            "configFileName": "common.conf",
            "configItemKey": "hive.metastore.type",
            "configItemValue": "USER_RDS"
          }
        ]
      },
      "displayReleaseVersion": "esr-3.0.0 (Spark 3.4.3, Scala 2.12)",
      "fusion": true,
      "vcoreSeconds": 8236,
      "mbSeconds": 33030784,
      "cuHours": 2.059,
      "resourceQueueId": "dev_queue"
    }
  ],
  "requestId": "DD6B1B2A-5837-5237-ABE4-FF0C8944****",
  "nextToken": "1",
  "maxResults": 20,
  "totalCount": 200
}
```
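To make the timestamp units concrete, the following sketch reads a response body like the sample above, converts the millisecond epoch values to datetimes, and prints a one-line summary per job run. The field names follow the response table; the helper itself is purely illustrative client-side code, not part of the API.

```python
import json
from datetime import datetime, timezone


def summarize(response_body: str) -> None:
    """Print a one-line summary for each job run in a ListJobRuns response."""
    data = json.loads(response_body)
    for run in data.get("jobRuns", []):
        # submitTime is a UNIX timestamp in milliseconds.
        submitted = datetime.fromtimestamp(run["submitTime"] / 1000, tz=timezone.utc)
        print(
            f"{run['jobRunId']} [{run['state']}] "
            f"submitted={submitted.isoformat()} "
            f"cuHours={run.get('cuHours')} queue={run.get('resourceQueueId')}"
        )
    print(f"total matching runs: {data.get('totalCount')}")
```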
Error codes
For a complete list of error codes, see Error Codes.
Release notes
For the change history of this operation, see Release Notes.