Call ModifyFlowJob to modify a data development job.


We recommend that you call this operation in OpenAPI Explorer. OpenAPI Explorer automatically calculates the signature value and dynamically generates sample code for the operation in different SDKs.
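If you build the request yourself instead of using OpenAPI Explorer or an SDK, the signature is computed with the standard Alibaba Cloud RPC signing algorithm (HMAC-SHA1 over a canonicalized query string). A minimal sketch in Python, using placeholder credentials and the example parameter values from the table below:

```python
import base64
import hashlib
import hmac
import urllib.parse


def percent_encode(s: str) -> str:
    # RFC 3986 encoding as required by the RPC signature rules:
    # letters, digits, '-', '_', '.', and '~' stay literal; spaces become %20.
    return urllib.parse.quote(str(s), safe="~")


def sign_request(params: dict, access_key_secret: str, http_method: str = "GET") -> str:
    # Sort parameters by name, then build the canonicalized query string.
    canonicalized = "&".join(
        f"{percent_encode(k)}={percent_encode(v)}"
        for k, v in sorted(params.items())
    )
    string_to_sign = f"{http_method}&{percent_encode('/')}&{percent_encode(canonicalized)}"
    # The signing key is the AccessKey secret with a trailing '&'.
    digest = hmac.new(
        (access_key_secret + "&").encode(), string_to_sign.encode(), hashlib.sha1
    ).digest()
    return base64.b64encode(digest).decode()


# Placeholder values for illustration only; a real request also carries the
# common parameters (AccessKeyId, SignatureNonce, Timestamp, and so on).
params = {
    "Action": "ModifyFlowJob",
    "Id": "FJ-BCCAE48B90CC****",
    "ProjectId": "FP-257A173659F5****",
    "RegionId": "cn-hangzhou",
}
signature = sign_request(params, "testSecret")
```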

Request parameters

Parameter Type Required Example Description
Action String Yes ModifyFlowJob

The operation that you want to perform. Set the value to ModifyFlowJob.

Id String Yes FJ-BCCAE48B90CC****

The job ID. You can call ListFlowJob to view the job ID.

ProjectId String Yes FP-257A173659F5****

The project ID. You can call ListFlowProject to view the project ID.

RegionId String Yes cn-hangzhou

The region ID. You can call DescribeRegions to view the latest list of Alibaba Cloud regions.

ResourceList.N.Path String Yes oss://path/demo.jar

The OSS or HDFS path of the resource.

Name String No my_shell_job

The new name of the job.

Description String No This is the description of a job

The new description of the job.

FailAct String No CONTINUE

The action to take if the job fails. Valid values:

  • CONTINUE: skip the job
  • STOP: stop the workflow
MaxRetry Integer No 5

The maximum number of retries. The value ranges from 0 to 5.

RetryPolicy String No None.

The retry policy. This parameter is reserved.

MaxRunningTimeSec Long No 0

The maximum running time, in seconds. This parameter is reserved.

RetryInterval Long No 200

The retry interval, which ranges from 0 to 300 (seconds).

Params String No ls -l

The content of the job.

ParamConf String No {"date":"${yyyy-MM-dd}"}

The configuration parameters of the job.

CustomVariables String No {\"scope\":\"PROJECT\",\"entityId\":\"FP-80C2FDDBF35D9CC5\",\"variables\":[{\"name\":\"v1\",\"value\":\"1\",\"properties\":{\"password\":true}}]}

The custom variables configured for the job.

EnvConf String No {"key":"value"}

The environment variables configured for the job.

Note The maximum length of the entire JSON string is 1024 bytes.
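Because the serialized EnvConf JSON must stay within 1024 bytes, it can be worth validating the length before sending the request. A small sketch (the helper name is our own):

```python
import json


def validate_env_conf(env: dict, limit: int = 1024) -> str:
    """Serialize EnvConf and enforce the documented 1024-byte limit."""
    encoded = json.dumps(env, separators=(",", ":"))
    if len(encoded.encode("utf-8")) > limit:
        raise ValueError(f"EnvConf exceeds {limit} bytes")
    return encoded


# The example value from the table serializes well within the limit.
env_conf = validate_env_conf({"key": "value"})
```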
RunConf String No {"priority":1,"userName":"hadoop","memory":2048,"cores":1}

The running configuration of the job. Valid fields:

  • priority: the priority of the job
  • userName: the Linux user that submits the job
  • memory: the memory, in MB
  • cores: the number of cores
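RunConf is passed as a JSON string, so it can be assembled from the fields above and serialized; a hedged sketch (the helper name is our own, the field names are the documented ones):

```python
import json


def build_run_conf(priority: int, user_name: str, memory_mb: int, cores: int) -> str:
    # Serialize the running configuration using the documented RunConf keys.
    return json.dumps(
        {"priority": priority, "userName": user_name, "memory": memory_mb, "cores": cores},
        separators=(",", ":"),
    )


# Reproduces the example value from the table.
run_conf = build_run_conf(1, "hadoop", 2048, 1)
```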
MonitorConf String No {"inputs":[{"type":"KAFKA","clusterId":"C-1234567","topics":"kafka_topic","":"kafka_consumer_group"}],"outputs":[{"type":"KAFKA","clusterId":"C-1234567","topics":"kafka_topic"}]}

The monitoring configuration. Only jobs of the SPARK_STREAMING type support this parameter.

Mode String No YARN

The mode in which the job is submitted. Valid values:

  • YARN: The job is packaged as a launcher and submitted to YARN for execution.
  • LOCAL: The job is submitted from the header or gateway node for execution.
ResourceList.N.Alias String No demo.jar

The alias of the resource.

ClusterId String No C-A23BD131A862****

The cluster ID.

AlertConf String No None.

The alert configuration. This parameter is reserved.
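Putting the table together, the flat query-parameter map for a ModifyFlowJob call can be built from the required parameters plus any optional ones. A sketch using the example values above (the helper name is our own; `resource_paths` expands into the repeated ResourceList.N.Path parameters):

```python
def modify_flow_job_params(job_id, project_id, region_id, resource_paths, **optional):
    """Build the flat query-parameter map for a ModifyFlowJob request.

    resource_paths maps to the repeated ResourceList.N.Path parameters.
    """
    params = {
        "Action": "ModifyFlowJob",
        "Id": job_id,
        "ProjectId": project_id,
        "RegionId": region_id,
    }
    for i, path in enumerate(resource_paths, start=1):
        params[f"ResourceList.{i}.Path"] = path
    params.update(optional)
    return params


# Example values taken from the parameter table.
params = modify_flow_job_params(
    "FJ-BCCAE48B90CC****",
    "FP-257A173659F5****",
    "cn-hangzhou",
    ["oss://path/demo.jar"],
    Name="my_shell_job",
    FailAct="CONTINUE",
    MaxRetry="5",
)
```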

Response parameters

Parameter Type Example Description
Data Boolean true

Indicates whether the job was modified. Valid values:

  • true: The operation was successful.
  • false: The operation failed.
RequestId String 549175a-6d14-4c8a-89f9-5e28300f6d7e

The ID of the request.


Sample requests

http(s)://[Endpoint]/?Action=ModifyFlowJob
&Id=FJ-BCCAE48B90CC****
&ProjectId=FP-257A173659F5****
&RegionId=cn-hangzhou
&ResourceList.1.Path=oss://path/demo.jar
&Description=This is a data development job
&Params=ls -l
&<common request parameters>

Sample success responses

XML format

<ModifyFlowJobResponse>
      <Data>true</Data>
      <RequestId>549175a-6d14-4c8a-89f9-5e28300f6d7e</RequestId>
</ModifyFlowJobResponse>

JSON format

{
    "Data": true,
    "RequestId": "549175a-6d14-4c8a-89f9-5e28300f6d7e"
}
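A JSON response can be checked for success by reading the Data field. A small sketch using the example values from the response table (the helper name is our own):

```python
import json

# Example response body built from the values in the response table.
response_body = '{"Data": true, "RequestId": "549175a-6d14-4c8a-89f9-5e28300f6d7e"}'


def job_modified(body: str) -> bool:
    # Data is true when the job was modified successfully.
    return bool(json.loads(body).get("Data"))


ok = job_modified(response_body)
```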