
E-MapReduce:UpdateProcessDefinitionWithSchedule

Last Updated: Dec 01, 2025

Updates a workflow definition and its scheduling settings.


RAM authorization

The table below describes the authorization required to call this API. You can define it in a Resource Access Management (RAM) policy. The table's columns are detailed below:

  • Action: The action that can be used in the Action element of a RAM policy statement to grant permission to perform the operation.

  • API: The API that you can call to perform the action.

  • Access level: The predefined level of access granted for each API. Valid values: create, list, get, update, and delete.

  • Resource type: The type of the resource that supports authorization to perform the action. It indicates if the action supports resource-level permission. The specified resource must be compatible with the action. Otherwise, the policy will be ineffective.

    • For APIs with resource-level permissions, required resource types are marked with an asterisk (*). Specify the corresponding Alibaba Cloud Resource Name (ARN) in the Resource element of the policy.

    • For APIs without resource-level permissions, it is shown as All Resources. Use an asterisk (*) in the Resource element of the policy.

  • Condition key: The condition keys defined by the service. The key allows for granular control, applying to either actions alone or actions associated with specific resources. In addition to service-specific condition keys, Alibaba Cloud provides a set of common condition keys applicable across all RAM-supported services.

  • Dependent action: The dependent actions required to run the action. To complete the action, the RAM user or the RAM role must have the permissions to perform all dependent actions.

| Action | Access level | Resource type | Condition key | Dependent action |
| --- | --- | --- | --- | --- |
| emr-serverless-spark:UpdateProcessDefinitionWithSchedule | none | All Resources `*` | None | None |
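As the table indicates, this action does not support resource-level permissions, so the Resource element must be `*`. A minimal RAM policy sketch granting only this action might look like the following:

```json
{
  "Version": "1",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "emr-serverless-spark:UpdateProcessDefinitionWithSchedule",
      "Resource": "*"
    }
  ]
}
```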

Request syntax

PUT /dolphinscheduler/projects/{bizId}/process-definition/{code} HTTP/1.1
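For illustration, the request path is assembled from the two path parameters described below. The IDs here are hypothetical placeholders, not real resources:

```python
# Build the request path from the documented path parameters.
biz_id = "w-d8xxxxxxxx"   # workspace ID (placeholder)
code = 123456789012       # workflow definition ID (placeholder)

path = f"/dolphinscheduler/projects/{biz_id}/process-definition/{code}"
print(path)
```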

Path Parameters

| Parameter | Type | Required | Description | Example |
| --- | --- | --- | --- | --- |
| bizId | string | Yes | The workspace ID. | w-d8******** |
| code | integer | Yes | The workflow definition ID. | 12************ |

Request parameters

Child parameters of an object or array are listed directly below their parent, in the order shown. Unnamed rows marked (object) or (string) describe the elements of the preceding array or object.

| Parameter | Type | Required | Description | Example |
| --- | --- | --- | --- | --- |
| productNamespace | string | Yes | The product code. | SS |
| name | string | Yes | The workflow name. | ods_batch_workflow |
| description | string | No | The workflow description. | ods batch workflow |
| timeout | integer | No | The default timeout period for the workflow execution. | 300 |
| taskDefinitionJson | array<object> | Yes | A JSON array of task definitions. This array contains the descriptive information for all tasks in the workflow. | |
| (object) | object | Yes | A JSON object for a task definition. This object contains the descriptive information for a task in the workflow. | |
| code | integer | Yes | The task definition ID. | 23************ |
| description | string | No | The description of the task definition. | ods transform task |
| alertEmailAddress | string | No | The email address for alerts. | foo_bar@spark.alert.invalid.com |
| startAlertEnable | boolean | No | Specifies whether to enable alerts when the task starts. | true |
| failAlertEnable | boolean | No | Specifies whether to enable alerts when the task fails. | true |
| failRetryTimes | integer | No | The number of times to retry the task if it fails. | 1 |
| name | string | Yes | The task name. | ods_transform_task |
| taskParams | object | Yes | The parameters for the task definition. | |
| workspaceBizId | string | Yes | The workspace ID. | w-d8******** |
| taskBizId | string | Yes | The ID of the Data Development task. | TSK-d87****************** |
| resourceQueueId | string | Yes | The name of the queue on which the task runs. | root_queue |
| sparkDriverCores | integer | No | The number of cores for the Spark driver. | 1 |
| sparkExecutorCores | integer | No | The number of cores for each Spark executor. | 1 |
| sparkDriverMemory | string | No | The memory size of the Spark driver. | 4g |
| sparkExecutorMemory | string | No | The memory size of each Spark executor. | 4g |
| sparkConf | array<object> | No | The Spark task configurations. | |
| (object) | object | No | A Spark task configuration. | |
| key | string | No | The key of the Spark configuration. | spark.dynamicAllocation.enabled |
| value | string | No | The value of the Spark configuration. | true |
| sparkVersion | string | No | The Spark engine version. | esr-4.0.0 (Spark 3.5.2, Scala 2.12) |
| sparkLogLevel | string | No | The Spark log level. | INFO |
| sparkLogPath | string | No | The path to store Spark task logs. | oss://data***/spark/logs |
| displaySparkVersion | string | No | The display version of the Spark engine. | esr-4.0.0 (Spark 3.5.2, Scala 2.12) |
| fusion | boolean | No | Specifies whether to enable the Fusion engine for acceleration. | false |
| environmentId | string | No | The environment ID. | ev-h************* |
| type | string | No | The Spark job type. | SQL |
| localParams | array<object> | No | | |
| (object) | object | No | | |
| prop | string | No | | |
| direct | string | No | | |
| type | string | No | | |
| value | string | No | | |
| taskType | string | Yes | The type of the workflow node. | EMR-SERVERLESS-SPARK |
| timeout | integer | No | The default timeout period for the task execution. | 30 |
| tags | object | No | The tags. | |
| (string) | string | No | A JSON string of tag key-value pairs. | "{\"tagkey\":\"tagvalue\"}" |
| taskRelationJson | array<object> | Yes | A JSON array that defines the dependencies between tasks in the workflow. `preTaskCode` specifies the upstream task ID, and `postTaskCode` specifies the downstream task ID. Each task must have a unique ID. For a task node without an upstream task, add a dependency and set `preTaskCode` to 0. | |
| (object) | object | Yes | A JSON object that defines a single task dependency. | |
| name | string | Yes | The name of the task topology. You can use the workflow name. | ods batch workflow |
| preTaskCode | integer | Yes | The upstream task ID. | 16************ |
| preTaskVersion | integer | Yes | The upstream task version. | 1 |
| postTaskCode | integer | Yes | The downstream task ID. | 19************ |
| postTaskVersion | integer | Yes | The downstream task version. | 1 |
| executionType | string | Yes | The execution policy. | PARALLEL |
| alertEmailAddress | string | No | The email address for alerts. | foo_bar@spark.alert.invalid.com |
| schedule | object | No | The scheduling configuration. | |
| startTime | string | No | The start time of the schedule. | 2024-12-23 16:13:27 |
| endTime | string | No | The end time of the schedule. | 2025-12-23 16:13:27 |
| crontab | string | No | The cron expression for scheduling. | 0 0 0 * * ? |
| timezoneId | string | No | The time zone ID. | Asia/Shanghai |
| retryTimes | integer | No | The number of retries. | 1 |
| taskParallelism | integer | No | The degree of concurrent execution for workflow nodes. | 1 |
| tags | object | No | The tags. | |
| (string) | string | No | A JSON string of tag key-value pairs. | "{\"tagkey\":\"tagvalue\"}" |
| resourceQueue | string | No | The resource queue. | root_queue |
| releaseState | string | No | The release state of the workflow. | ONLINE |
| runAs | string | No | The user to run the workflow. | 113*************** |
| publish | boolean | No | Specifies whether to publish the workflow. | true |
| regionId | string | No | The region ID. | cn-hangzhou |
| globalParams | array<object> | No | | |
| (object) | object | No | | |
| prop | string | No | | |
| value | string | No | | |
| direct | string | No | | |
| type | string | No | | |
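The `taskDefinitionJson`/`taskRelationJson` structure above can be sketched as follows. All IDs, task names, and the `TSK-…` values are hypothetical placeholders; setting `preTaskVersion` to 0 for the root dependency (where `preTaskCode` is 0) is an assumption. Only a subset of the documented fields is shown:

```python
import json

# Hypothetical placeholder IDs for illustration only.
WORKSPACE = "w-d8xxxxxxxx"
TASK_A = 1600000000000  # upstream task definition ID
TASK_B = 1900000000000  # downstream task definition ID

payload = {
    "productNamespace": "SS",
    "name": "ods_batch_workflow",
    "taskDefinitionJson": [
        {
            "code": TASK_A,
            "name": "ods_extract_task",
            "taskType": "EMR-SERVERLESS-SPARK",
            "taskParams": {
                "workspaceBizId": WORKSPACE,
                "taskBizId": "TSK-aaaaaaaaaaaaaaaaaa",  # placeholder
                "resourceQueueId": "root_queue",
            },
        },
        {
            "code": TASK_B,
            "name": "ods_transform_task",
            "taskType": "EMR-SERVERLESS-SPARK",
            "taskParams": {
                "workspaceBizId": WORKSPACE,
                "taskBizId": "TSK-bbbbbbbbbbbbbbbbbb",  # placeholder
                "resourceQueueId": "root_queue",
            },
        },
    ],
    # TASK_A has no upstream task, so its relation entry sets preTaskCode to 0.
    "taskRelationJson": [
        {"name": "ods batch workflow", "preTaskCode": 0, "preTaskVersion": 0,
         "postTaskCode": TASK_A, "postTaskVersion": 1},
        {"name": "ods batch workflow", "preTaskCode": TASK_A, "preTaskVersion": 1,
         "postTaskCode": TASK_B, "postTaskVersion": 1},
    ],
    "executionType": "PARALLEL",
    "schedule": {"crontab": "0 0 0 * * ?", "timezoneId": "Asia/Shanghai"},
}

body = json.dumps(payload)  # request body for the PUT call
```

The resulting dependency graph is a simple chain: root → TASK_A → TASK_B.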

Response elements

| Element | Type | Description | Example |
| --- | --- | --- | --- |
| (root) | object | Schema of the response. | |
| requestId | string | The request ID. | DD6B1B2A-5837-5237-ABE4-FF0C8944**** |
| code | integer | The return code from the backend API. | 1400009 |
| msg | string | The detailed description that corresponds to the return code. | No permission for resource action |
| success | string | Indicates whether the request was successful. | true |
| failed | string | Indicates whether the request failed. | false |
| data | object | The returned data. | |
| id | string | The workflow definition number. | 123223 |
| code | string | The workflow definition ID. | 12*********** |
| name | string | The name of the workflow definition. | ods_batch_workflow |
| version | integer | The version number. | 1 |
| releaseState | string | The release state of the workflow. | ONLINE |
| bizId | string | The workspace ID. | alicloud_ack_one_cluster |
| description | string | The description of the workflow definition. | 1 |
| createTime | string | The time when the workflow definition was created. | 2024-09-05T02:03:19Z |
| updateTime | string | The time when the workflow definition was updated. | 2024-03-05T06:24:27Z |
| userId | string | The ID of the user who scheduled the workflow. | 113********* |
| userName | string | The name of the user who scheduled the workflow. | w-******** |
| projectName | string | The name of the project to which the workflow belongs. | w-******** |
| executionType | string | The execution policy. | SERIAL |
| alertEmailAddress | string | The email address for alerts. | foo_bar@spark.alert.invalid.com |
| startTime | string | The start time of the schedule. | 0 |
| endTime | string | The end time of the schedule. | 1710432000000 |
| timezoneId | string | The time zone ID. | Asia/Shanghai |
| crontab | string | The cron expression for scheduling. | 0 0 0 * * ? |
| versionHashCode | string | The hash code of the version. | dwerf********* |
| httpStatusCode | integer | The HTTP status code. | 200 |

Examples

Success response

JSON format

{
  "requestId": "DD6B1B2A-5837-5237-ABE4-FF0C8944****",
  "code": 1400009,
  "msg": "No permission for resource action",
  "success": "true",
  "failed": "false",
  "data": {
    "id": "123223",
    "code": "12***********",
    "name": "ods_batch_workflow",
    "version": 1,
    "releaseState": "ONLINE",
    "bizId": "alicloud_ack_one_cluster",
    "description": "1",
    "createTime": "2024-09-05T02:03:19Z",
    "updateTime": "2024-03-05T06:24:27Z",
    "userId": "113*********",
    "userName": "w-********",
    "projectName": "w-********",
    "executionType": "SERIAL",
    "alertEmailAddress": "foo_bar@spark.alert.invalid.com\n",
    "startTime": "0",
    "endTime": "1710432000000",
    "timezoneId": "Asia/Shanghai\n",
    "crontab": "0 0 0 * * ?\n",
    "versionHashCode": "dwerf*********"
  },
  "httpStatusCode": 200
}
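A minimal sketch of consuming the fields documented above. Note that `success` is returned as the string `"true"`, not a JSON boolean, so it is compared as a string here. The response text is abridged to the fields actually checked:

```python
import json

# The success response shown above, abridged to the fields checked below.
response_text = """
{
  "requestId": "DD6B1B2A-5837-5237-ABE4-FF0C8944****",
  "code": 1400009,
  "success": "true",
  "data": {"code": "12***********", "releaseState": "ONLINE", "version": 1},
  "httpStatusCode": 200
}
"""

resp = json.loads(response_text)

# success is a string ("true"/"false"), not a boolean.
ok = resp["httpStatusCode"] == 200 and resp["success"] == "true"
workflow_code = resp["data"]["code"] if ok else None
```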

Error codes

See Error Codes for a complete list.

Release notes

See Release Notes for a complete list.