In data workflows, task code such as SQL often needs to change dynamically with the schedule time to process different data partitions. You can use scheduling parameters to avoid changing code manually. Set placeholders in your code. The system then automatically replaces these placeholders with dynamic values, such as the data timestamp and the scheduled runtime. This automates and parameterizes your workflow.
Core configuration process
To use scheduling parameters, define them and assign values in the Scheduling Configurations section. After you test the code, submit it to Operation Center. The system then automatically schedules the task and dynamically replaces the scheduling parameters with their configured values.
Step | Action | Core Goal
--- | --- | ---
1. Define parameters | In the node code, define one or more parameters using the ${param} syntax. | Reserves a placeholder for a dynamic value.
2. Configure parameters | In the Scheduling Parameters panel for the node, assign values to the variables in the code. | Associates the placeholder with a dynamic value.
3. Test | Use the Smoke Testing feature to simulate a specific data timestamp and verify the correctness of parameter replacement and code execution. | Ensures that the configuration is correct in the development environment.
4. Publish and verify | Submit the node to the production environment, and then confirm the final parameter configuration in Operation Center. | Ensures that the parameters of the production task meet expectations.
Steps
1. Define parameters

Double-click the target node, such as an ODPS SQL node, to open the node editor.
Define parameters in the code. In the code of an ODPS SQL node or another SQL node, use the ${param} syntax to define a parameter name. DataWorks recommends that you use meaningful parameter names for easier reference and management.
Scheduling parameter call formats:
Format type | Call syntax | Scope | Notes
--- | --- | --- | ---
General format | ${ParameterName} | Applies to most node types, such as ODPS SQL and synchronization nodes. | This is the most common format.
Special format | Varies by node and does not use the ${...} format. | PyODPS, Shell | For more information, see Examples of scheduling parameter configurations for different node types.
-- Example: Define a variable named pt_date for partition filtering
SELECT * FROM my_table WHERE ds = '${pt_date}';

On the right side of the page, click Scheduling Configurations to go to the Scheduling Parameters section.
Configure the scheduling parameters as described in the next section.
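To illustrate what happens at run time, the following minimal Python sketch shows how a ${param} placeholder can be replaced with a configured value. This is an illustration only, not the actual DataWorks replacement engine; the render function is hypothetical.

```python
import re

def render(code: str, params: dict) -> str:
    # Replace each ${name} placeholder with its configured value.
    return re.sub(r"\$\{(\w+)\}", lambda m: params[m.group(1)], code)

sql = "SELECT * FROM my_table WHERE ds = '${pt_date}';"
print(render(sql, {"pt_date": "20251016"}))
# SELECT * FROM my_table WHERE ds = '20251016';
```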
2. Configure parameters
You can set scheduling parameters in two ways: Visual Definition and Define By Expression. You can switch between these modes by clicking Define By Expression in the upper-right corner of the parameter list. The default mode is Visual Definition.
Configure parameters
Visual definition
Click Scheduling Configurations to the right of the node to open the scheduling parameter configuration interface.

Add parameters
To the right of the node, click Scheduling Configurations. In the Scheduling Parameters section, you can add parameters in one of the following two ways.
Click Add Parameter and manually enter the parameter name and value. The parameter name must match the variable name defined in the code.
Click Load Parameters From Code. DataWorks automatically parses variables in the code, such as ${pt_date}, and adds them as parameters. Then, enter a value for each parameter.
Assign values to parameters
You can set built-in system variables, custom time variables, and constants.
Click the input box. The drop-down list shows some common parameter expressions that you can select directly. You can also manually enter a custom expression or a built-in system variable.
Enter values as needed. For more information about the supported range of parameter values, see Supported formats for scheduling parameters.
Define by expression
Click Define By Expression to configure parameters using expressions.

When you use an expression to define multiple parameters, you must separate them with spaces.
When you use the Define By Expression method to add, delete, or modify scheduling parameters, DataWorks validates the expression syntax. You cannot configure the scheduling parameter if the syntax is invalid.
For example, DataWorks enforces syntax rules such as the rule that no spaces are allowed on either side of the equal sign.
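As an illustration of this kind of validation, the following Python sketch checks that an expression is a series of space-separated name=value pairs with no spaces around the equal sign. The validate helper is hypothetical and only approximates the checks that DataWorks performs.

```python
import re

PAIR = re.compile(r"^\w+=\S+$")  # name=value, with no spaces around '='

def validate(expr: str) -> bool:
    # An expression is valid only if every space-separated part
    # is a well-formed name=value pair.
    return all(PAIR.match(part) for part in expr.split(" "))

print(validate("var1=$bizdate var2=$cyctime"))  # True
print(validate("var1 = $bizdate"))              # False: spaces around '='
```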
Parameter preview
After you define the parameters, click Scheduling Parameter Preview to preview the parameters for the next N instances that run after a specified data timestamp. This helps you verify that the parameter definitions are configured as expected. You can adjust the data timestamp and the number of instances for the preview.
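The preview logic can be approximated as follows, assuming a daily schedule and the convention that the data timestamp is one day before the scheduled runtime. The preview function is a hypothetical sketch, not the DataWorks implementation.

```python
from datetime import datetime, timedelta

def preview(data_timestamp: datetime, n: int,
            interval: timedelta = timedelta(days=1)):
    # Hypothetical sketch: the first runtime is one day after the data
    # timestamp; each instance's bizdate is one day before its runtime.
    rows = []
    cyctime = data_timestamp + timedelta(days=1)
    for _ in range(n):
        rows.append({
            "bizdate": (cyctime - timedelta(days=1)).strftime("%Y%m%d"),
            "cyctime": cyctime.strftime("%Y%m%d%H%M%S"),
        })
        cyctime += interval
    return rows

for row in preview(datetime(2025, 10, 16), 3):
    print(row)
```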

Some nodes, such as offline synchronization nodes, have a built-in ${bizdate} parameter. This parameter is automatically assigned the value $bizdate. You can replace the bizdate parameter name in the code with a custom one. The ${bizdate} parameter itself has no special meaning and is the same as any other custom parameter.

3. Smoke testing
After you assign values to the scheduling parameters, you can use the smoke testing feature. You can configure a data timestamp to simulate the scheduling scenario for the target node. You can verify that the code execution and parameter replacement work as expected in this scenario. If they do not, you must adjust the settings as needed to prevent issues with normal task scheduling.
Smoke testing generates instances and incurs instance fees. For more information about instance fees, see Billing of serverless resource groups.
Submit the node code.
Configure the schedule time and scheduling dependencies.
Click the save icon to save the code and configuration. Then, click the submit icon. You can use the smoke testing feature in the development environment only after the latest code for the node is submitted to Operation Center.
Note: If you find that the smoke test is not running the latest code or parameters, you must submit the node again.
Run a smoke test.
Click the smoke testing icon in the toolbar. In the Smoke Testing dialog box, select a data timestamp and click OK to run the smoke test.
View smoke test logs.
In the Smoke Test Records window, find the latest record and click View Log.

In the log, check the parameter output to confirm that it meets your expectations.
Note: If you accidentally close the window, you can click the smoke test records icon in the toolbar to reopen it.
The Run and Advanced Run features require you to manually assign constants to variables in the code. Therefore, you cannot use these features to verify whether the configured scheduling parameters work as expected.
4. Publish and verify
After verification in the development environment, you can submit and publish the task to Operation Center for production and automatic scheduling. After you publish the task, you must check the scheduling parameters in the production environment to prevent runtime errors.
If the scheduling parameter configuration of the auto triggered task is not as expected, or if you cannot find the target task in Operation Center, you must confirm that the task was published successfully. For more information about how to publish tasks, see Publish a task.
Check parameter definitions.
Go to Operation Center, switch to the destination region and workspace, and navigate to the Recurring Tasks page. In the task list, click the task name and verify that the execution parameters in the Properties panel are correct.

Smoke testing in Operation Center.
In Operation Center, you can also use smoke testing to confirm whether a submitted and published task performs parameter replacement and code execution as expected in the production environment. For more information, see Run a test and view the test instance.
Important: Smoke testing actually executes on production data. Proceed with caution to avoid contaminating the production database.

Observe actual scheduling results.
After the task is automatically scheduled, you can further verify that the parameters were replaced as required by checking them on the Recurring Instance page.

Complete configuration example
This topic uses an ODPS SQL node as an example to show how to use the smoke testing feature in the development environment to test whether the configured scheduling parameters work as expected. It also shows how to view the scheduling parameter configuration of the task in Operation Center after the node is published.
For more information about how to configure scheduling parameters for different types of nodes, see Examples of scheduling parameter configurations for different node types.
Edit the node code and configure scheduling parameters.
The following figure shows the code and scheduling parameter configuration of the ODPS SQL node.

Define variables in the code.
-- Assign built-in system parameters
SELECT '${var1}';
SELECT '${var2}';
-- Assign custom parameters
SELECT '${var3}';
SELECT '${var4}';
-- Assign a constant
SELECT '${var5}';

Assign values to the variables.
In the Scheduling Parameters section, you can assign values to the variables as shown in Area 2. For more information about value formats, see Supported formats for scheduling parameters.
var1=$bizdate: the data timestamp in the yyyymmdd format.
var2=$cyctime: the scheduled runtime of the task in the yyyymmddhh24miss format.
var3=${yyyymmdd}: the data timestamp in the yyyymmdd format.
var4=$[yyyymmddhh24miss]: the scheduled runtime of the task in the yyyymmddhh24miss format.
var5=Hangzhou: sets the value of var5 to the constant Hangzhou.
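The value assignments above can be reproduced with a short Python sketch. Assuming the 16:00 instance on 2025-10-17 and the convention that the data timestamp is one day before the scheduled runtime, the variables resolve as follows (illustration only; DataWorks computes these values internally):

```python
from datetime import datetime, timedelta

# Scheduled runtime of the 16:00 instance on 2025-10-17
cyctime = datetime(2025, 10, 17, 16, 0, 0)
bizdate = cyctime - timedelta(days=1)  # data timestamp = runtime - 1 day

values = {
    "var1": bizdate.strftime("%Y%m%d"),        # $bizdate            -> 20251016
    "var2": cyctime.strftime("%Y%m%d%H%M%S"),  # $cyctime            -> 20251017160000
    "var3": bizdate.strftime("%Y%m%d"),        # ${yyyymmdd}         -> 20251016
    "var4": cyctime.strftime("%Y%m%d%H%M%S"),  # $[yyyymmddhh24miss] -> 20251017160000
    "var5": "Hangzhou",                        # constant
}
print(values)
```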
Optional: Configure the schedule time.
Configure the ODPS SQL node to be scheduled hourly (as shown in Area 3).
Note: You can configure the time period as needed. This topic uses an example where a time period is added.
Start time: 16:00
End time: 23:59
Interval: 1 hour
For more information about time period configuration, see Time property configuration instructions.
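As a quick sanity check of this time-period configuration, the following Python sketch enumerates the hourly run times between 16:00 and 23:59, which yields eight instances per day (illustration only, not DataWorks scheduling code):

```python
from datetime import datetime, timedelta

# Hourly schedule between 16:00 and 23:59 on a given day
start = datetime(2025, 10, 17, 16, 0)
end = datetime(2025, 10, 17, 23, 59)

run_times = []
t = start
while t <= end:
    run_times.append(t.strftime("%H:%M"))
    t += timedelta(hours=1)

print(run_times)  # eight instances: 16:00 through 23:00
```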
Set scheduling dependencies.
Configure scheduling dependencies for the development node. For more information, see Configure scheduling dependencies. In this example, the root node is used as the upstream dependency for this node.
In the toolbar at the top of the node editor, click the save icon and then the submit icon to save and submit the configuration of the ODPS SQL node.
Run a smoke test in the development environment.
Click the smoke testing icon. In the Smoke Testing dialog box, you can configure the business time to simulate the scheduling period for the node.
The business time is configured as follows:
Data Timestamp: 2025-10-16
Start Time: 16:00
End Time: 17:00
The ODPS SQL task is an hourly scheduled task. Two instances are generated for the task, at 16:00 and 17:00 on 2025-10-17.
Note: Because the data timestamp is one day before the runtime, the actual runtime of the task is 2025-10-17.
The expected values for the 16:00 instance are as follows: var1=20251016, var2=20251017160000, var3=20251016, var4=20251017160000.
The expected values for the 17:00 instance are as follows: var1=20251016, var2=20251017170000, var3=20251016, var4=20251017170000.
Click OK. The node is scheduled to run at the specified time.
After the runtime ends, click the smoke test records icon to view the smoke test logs. The two instances that are generated by the node run successfully, and the node's execution results meet expectations.



If your workspace is in standard mode, you need to publish the node to the production environment. On the ODPS SQL node editor page, click Publish in the upper-right corner of the top menu bar. For more information about publishing a node, see Publish a node.
Go to Operation Center to confirm the scheduling parameter configuration of the node.

On the top menu bar of DataStudio, click Operation Center in the upper-right corner to open the Operation Center page.
On the Recurring Tasks page, you can search for the target node.
Note: You can search for the node on the Recurring Tasks page only after it is successfully published.
Click the target node name and view the Execution Parameters on the Properties tab.
In this example, the execution parameters of the node are var1=$bizdate var2=$cyctime var3=${yyyymmdd} var4=$[yyyymmddhh24miss], which meets expectations.
After a scheduled instance is generated, click the Recurring Instance menu, search for the task name, and click the task instance name. On the Properties tab, you can view the replaced parameters under Execution Parameters.
In this example, the execution parameters of the node are
var1=20251016 var2=20251017160000 var3=20251016 var4=20251017160000, which meets expectations.