E-MapReduce:GetJobRun

Last Updated: Nov 02, 2025

Retrieves the details of a job run.

Try it now

You can try this API in OpenAPI Explorer without manually signing the request. After a call succeeds, OpenAPI Explorer automatically generates SDK sample code that matches your parameters. You can download the generated code, which has built-in credential security, and use it locally.

RAM authorization

The table below describes the authorization required to call this API. You can define it in a Resource Access Management (RAM) policy. The table's columns are detailed below:

  • Action: The actions that can be used in the Action element of RAM policy statements to grant permissions to perform the operation.

  • API: The API that you can call to perform the action.

  • Access level: The predefined level of access granted for each API. Valid values: create, list, get, update, and delete.

  • Resource type: The type of resource that supports authorization for the action. It indicates whether the action supports resource-level permissions. The specified resource must be compatible with the action; otherwise, the policy does not take effect.

    • For APIs with resource-level permissions, required resource types are marked with an asterisk (*). Specify the corresponding Alibaba Cloud Resource Name (ARN) in the Resource element of the policy.

    • For APIs without resource-level permissions, it is shown as All Resources. Use an asterisk (*) in the Resource element of the policy.

  • Condition key: The condition keys defined by the service. The key allows for granular control, applying to either actions alone or actions associated with specific resources. In addition to service-specific condition keys, Alibaba Cloud provides a set of common condition keys applicable across all RAM-supported services.

  • Dependent action: The dependent actions required to run the action. To complete the action, the RAM user or the RAM role must have the permissions to perform all dependent actions.

Action | Access level | Resource type | Condition key | Dependent action
emr-serverless-spark:GetJobRun | get | All Resources (*) | None | None
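
Taken together with the policy rules above, a statement that allows this operation sets Action to emr-serverless-spark:GetJobRun and, because the action does not support resource-level permissions, sets Resource to an asterisk. The following Python sketch builds such a policy document and prints it as JSON; how you create the policy and attach it to a RAM user or role is outside the scope of this operation.

import json

# Minimal RAM policy sketch (not an official template): it grants permission
# to call emr-serverless-spark:GetJobRun. Resource is "*" because this action
# does not support resource-level permissions (see the table above).
policy = {
    "Version": "1",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "emr-serverless-spark:GetJobRun",
            "Resource": "*",
        }
    ],
}

# Print the JSON policy document so it can be pasted into the RAM console.
print(json.dumps(policy, indent=2))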

Request syntax

GET /api/v1/workspaces/{workspaceId}/jobRuns/{jobRunId} HTTP/1.1

Path parameters

  • workspaceId (string, required): The workspace ID. Example: w-d2d82aa09151****
  • jobRunId (string, required): The job run ID. Example: jr-93d98d2f7061****

Request parameters

  • regionId (string, optional): The region ID. Example: cn-hangzhou
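
To show how the path and query parameters fit together, the following Python sketch builds the request URL for this operation. The endpoint host used here is an assumption for illustration, and the request is sent without the required Alibaba Cloud signature, so the service will reject it; for real calls, use an SDK or the code generated by OpenAPI Explorer, which handle signing for you.

import requests

# Placeholder identifiers; replace them with your own values.
region_id = "cn-hangzhou"
workspace_id = "w-d2d82aa09151****"
job_run_id = "jr-93d98d2f7061****"

# Assumed endpoint host pattern, for illustration only; look up the actual
# endpoint for your region in the service documentation.
host = f"emr-serverless-spark.{region_id}.aliyuncs.com"

# GET /api/v1/workspaces/{workspaceId}/jobRuns/{jobRunId}
url = f"https://{host}/api/v1/workspaces/{workspace_id}/jobRuns/{job_run_id}"

# regionId is the only documented query parameter and is optional.
response = requests.get(url, params={"regionId": region_id})

# Unsigned requests are rejected; the point of this sketch is the shape of
# the URL and parameters, not a working call.
print(response.status_code)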

Response elements

The response is an object (the returned data) that contains the following elements:

  • jobRun (object): The details of the job run.
    • workspaceId (string): The workspace ID. Example: w-d2d82aa09155****
    • jobRunId (string): The job run ID. Example: jr-93d98d2f7061****
    • name (string): The job name. Example: jobName
    • state (string): The job status. Example: Running
    • stateChangeReason (object): The reason for the state change.
      • code (string): The error code. Example: ERR-100000
      • message (string): The error message. Example: connection refused
    • submitTime (integer): The time when the job was submitted. Example: 1684119314000
    • endTime (integer): The time when the job ended. Example: 1684122914000
    • codeType (string): The code type of the job. Valid values: SQL, JAR, PYTHON. Example: SQL
    • webUI (string): The web UI for the job. Example: http://spark-ui
    • executionTimeoutSeconds (integer): The timeout period for the job run, in seconds. Example: 3600
    • resourceOwnerId (string): The user ID (UID) of the user who created the job. Example: 150978934701****
    • tags (array of Tag): The tags; each element is a Tag object that describes a tag.
    • log (RunLog): The path of the job run log.
    • releaseVersion (string): The version of the Spark engine used to run the job. Example: esr-3.3.1
    • resourceQueueId (string): The name of the queue where the job runs. Example: root_queue
    • jobDriver (JobDriver): The information about the Spark Driver.
    • configurationOverrides (object): The Spark job configuration.
      • configurations (array of Configuration): The configurations; each element is a Configuration object with the configuration information.
    • displayReleaseVersion (string): The version displayed in the console. Example: esr-4.0.0 (Spark 3.5.2, Scala 2.12)
    • fusion (boolean): Indicates whether the Fusion engine acceleration feature is enabled. Example: false
    • environmentId (string): The environment ID. Example: env-cpv569tlhtgndjl8****
    • notebookAccessUrl (string): The URL used to access the notebook. Example: http://workflow-ide-cn-hangzhou.oss-cn-hangzhou.aliyuncs.com/spark-notebook-output/w-xxxxxxxxx/xxxxxxx
  • requestId (string): The request ID. Example: DD6B1B2A-5837-5237-ABE4-FF0C8944****

Examples

Success response

JSON format

{
  "jobRun": {
    "workspaceId": "w-d2d82aa09155****",
    "jobRunId": "jr-93d98d2f7061****",
    "name": "jobName",
    "state": "Running",
    "stateChangeReason": {
      "code": "ERR-100000",
      "message": "connection refused"
    },
    "submitTime": 1684119314000,
    "endTime": 1684122914000,
    "codeType": "SQL",
    "webUI": "http://spark-ui",
    "executionTimeoutSeconds": 3600,
    "resourceOwnerId": "150978934701****",
    "tags": [
      {
        "key": "workflowId",
        "value": "wf-123test"
      }
    ],
    "log": {
      "driverStdOut": "oss://bucket/path/to/stdout",
      "driverStdError": "oss://bucket/path/to/stderr",
      "driverSyslog": "oss://bucket/path/to/syslog",
      "driverStartup": "oss://bucket/path/to/startup"
    },
    "releaseVersion": "esr-3.3.1",
    "resourceQueueId": "root_queue",
    "jobDriver": {
      "sparkSubmit": {
        "entryPoint": "oss://bucket/path/to/entrypoint.jar",
        "entryPointArguments": [
          "arg1"
        ],
        "sparkSubmitParameters": "--conf spark.app.name=test"
      }
    },
    "configurationOverrides": {
      "configurations": [
        {
          "configFileName": "common.conf",
          "configItemKey": "hive.metastore.type",
          "configItemValue": "USER_RDS"
        }
      ]
    },
    "displayReleaseVersion": "esr-4.0.0 (Spark 3.5.2, Scala 2.12)",
    "fusion": false,
    "environmentId": "env-cpv569tlhtgndjl8****",
    "notebookAccessUrl": "http://workflow-ide-cn-hangzhou.oss-cn-hangzhou.aliyuncs.com/spark-notebook-output/w-xxxxxxxxx/xxxxxxx"
  },
  "requestId": "DD6B1B2A-5837-5237-ABE4-FF0C8944****"
}
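
The submitTime and endTime values in this example are Unix timestamps in milliseconds. The following Python sketch, which assumes the response has already been parsed into a dict named payload, reads the fields described above and summarizes the job run:

from datetime import datetime, timezone

def summarize_job_run(payload: dict) -> str:
    # payload is the parsed JSON response, e.g. json.loads(response_body).
    job_run = payload["jobRun"]

    # submitTime and endTime are Unix timestamps in milliseconds; this sketch
    # assumes both are present, as in the example above.
    submitted = datetime.fromtimestamp(job_run["submitTime"] / 1000, tz=timezone.utc)
    ended = datetime.fromtimestamp(job_run["endTime"] / 1000, tz=timezone.utc)
    duration_s = (ended - submitted).total_seconds()

    # stateChangeReason carries an error code and message when the state
    # changed because of a failure; it may be empty otherwise.
    reason = job_run.get("stateChangeReason") or {}
    detail = f" ({reason.get('code')}: {reason.get('message')})" if reason else ""

    return (f"{job_run['name']} [{job_run['jobRunId']}] is {job_run['state']}{detail}, "
            f"ran for {duration_s:.0f} s")

For the example response above, this returns: jobName [jr-93d98d2f7061****] is Running (ERR-100000: connection refused), ran for 3600 s.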

Error codes

See Error Codes for a complete list.

Release notes

See Release Notes for a complete list.