You can call the ModifyFlowJob operation to modify a data development job.


OpenAPI Explorer automatically calculates the signature value. For your convenience, we recommend that you call this operation in OpenAPI Explorer. You can use OpenAPI Explorer to search for API operations, call API operations, and dynamically generate SDK sample code.

Request parameters

Parameter Type Required Example Description
Action String Yes ModifyFlowJob

The operation that you want to perform. This parameter is required if you construct the HTTP or HTTPS request URL yourself. Set the value to ModifyFlowJob.

Id String Yes FJ-BCCAE48B90CC****

The ID of the job. You can call the ListFlowJob operation to view the job ID.

ProjectId String Yes FP-257A173659F5****

The ID of the project. You can call the ListFlowProject operation to view the project ID.

RegionId String Yes cn-hangzhou

The region ID of the instance. You can call the DescribeRegions operation to query the most recent region list.

ResourceList.N.Path String Yes oss://path/demo.jar

The OSS or HDFS path of the resource.
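ResourceList is a repeated parameter: each resource contributes indexed keys such as ResourceList.1.Path (and optionally ResourceList.1.Alias, described below). A minimal sketch of the flattening, assuming the parameter names from this table; the helper function itself is illustrative, not part of any SDK:

```python
# Flatten a list of resources into the indexed ResourceList.N.* request
# parameters this operation expects. The helper name is hypothetical.
def flatten_resource_list(resources):
    params = {}
    for i, res in enumerate(resources, start=1):  # N is 1-based
        params[f"ResourceList.{i}.Path"] = res["Path"]
        if "Alias" in res:  # Alias is optional per the table below
            params[f"ResourceList.{i}.Alias"] = res["Alias"]
    return params

print(flatten_resource_list(
    [{"Path": "oss://path/demo.jar", "Alias": "demo.jar"}]
))
```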

Name String No my_shell_job

The name of the job after the modification.

Description String No This is the description of a job

The description of the job after the modification.

FailAct String No CONTINUE

The action to take if the job fails. Valid values:

  • CONTINUE: skips the failed job and continues the workflow.
  • STOP: stops the workflow.
MaxRetry Integer No 5

The maximum number of retries. Valid values: 0 to 5.

RetryPolicy String No None

The retry policy. This is a reserved parameter.

MaxRunningTimeSec Long No 0

A reserved parameter.

RetryInterval Long No 200

The interval between retries. Valid values: 0 to 300. Unit: seconds.

Params String No ls -l

The content of the job.

ParamConf String No {"date":"${yyyy-MM-dd}"}

The configuration parameters of the job.

CustomVariables String No {\"scope\":\"PROJECT\",\"entityId\":\"FP-80C2FDDBF35D9CC5\",\"variables\":[{\"name\":\"v1\",\"value\":\"1\",\"properties\":{\"password\":true}}]}

The custom variables configured for the job.
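CustomVariables, like the other structured parameters here, is passed as a JSON string, which is why the example above shows escaped quotation marks. A sketch that builds the example payload with Python's json module (field names and values taken from the example above):

```python
import json

# Build the CustomVariables payload from the example above and serialize
# it to the JSON string the request parameter expects.
custom_variables = {
    "scope": "PROJECT",
    "entityId": "FP-80C2FDDBF35D9CC5",
    "variables": [
        {"name": "v1", "value": "1", "properties": {"password": True}}
    ],
}
encoded = json.dumps(custom_variables)
print(encoded)
```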

EnvConf String No {"key":"value"}

The environment variables configured for the job.

RunConf String No {"priority":1,"userName":"hadoop","memory":2048,"cores":1}

The running configuration. Fields:

  • priority: the priority of the job
  • userName: the Linux user that submits the job
  • memory: the memory allocated to the job, in MB
  • cores: the number of cores
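RunConf is also passed as a JSON string; a sketch that reproduces the example value above (compact separators are used only to match the example's formatting):

```python
import json

# Compose the RunConf example from the table: priority, submitting Linux
# user, memory in MB, and core count, serialized as a JSON string.
run_conf = {"priority": 1, "userName": "hadoop", "memory": 2048, "cores": 1}
run_conf_param = json.dumps(run_conf, separators=(",", ":"))
print(run_conf_param)
# → {"priority":1,"userName":"hadoop","memory":2048,"cores":1}
```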
MonitorConf String No {"inputs":[{"type":"KAFKA","clusterId":"C-1234567","topics":"kafka_topic","consumer.group":"kafka_consumer_group"}],"outputs":[{"type":"KAFKA","clusterId":"C-1234567","topics":"kafka_topic"}]}

The monitoring configuration. This parameter is supported only by SPARK_STREAMING jobs.

Mode String No YARN

The submission mode of the job. Valid values:

  • YARN: submits the job from a worker node
  • LOCAL: submits the job from a header or gateway node
ResourceList.N.Alias String No demo.jar

The alias of the resource.

ClusterId String No C-A23BD131A862****

The ID of the cluster.

AlertConf String No None

A reserved parameter.

Response parameters

Parameter Type Example Description
Data Boolean true

Indicates whether the modification was successful. Valid values:

  • true: The operation was successful.
  • false: The operation failed.
RequestId String 549175a-6d14-4c8a-89f9-5e28300f6d7e

The ID of the request.
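The Data field reports the outcome; a sketch of checking it, assuming the response has already been parsed into a dict (the HTTP call and request signing are out of scope here; the sample values come from the table above):

```python
# Inspect a parsed ModifyFlowJob response. Data is a Boolean: True means
# the modification succeeded, False means it failed.
response = {
    "Data": True,
    "RequestId": "549175a-6d14-4c8a-89f9-5e28300f6d7e",
}
if response["Data"]:
    print(f"modification succeeded, request {response['RequestId']}")
else:
    print("modification failed")
```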


Sample requests

http(s)://[Endpoint]/?Action=ModifyFlowJob
&Id=FJ-BCCAE48B90CC****
&ProjectId=FP-257A173659F5****
&RegionId=cn-hangzhou
&ResourceList.1.Path=oss://path/demo.jar
&Description=This is the description of a job
&Params=ls -l
&<common request parameters>

Sample success responses

XML format

<ModifyFlowJobResponse>
      <Data>true</Data>
      <RequestId>549175a-6d14-4c8a-89f9-5e28300f6d7e</RequestId>
</ModifyFlowJobResponse>

JSON format

{
    "Data": true,
    "RequestId": "549175a-6d14-4c8a-89f9-5e28300f6d7e"
}