A data push node queries data from upstream nodes in a workflow and pushes the results to DingTalk groups, Lark groups, WeCom groups, Microsoft Teams, or email. Groups and teams receive the latest data automatically after each scheduled run.
How it works
A data push node reads the output parameters of its ancestor nodes and uses those values as its own input parameters. You reference those values in the push content using placeholders.
Two ancestor node types are supported:
SQL query node — queries a data source and exposes query results as output parameters. Reference fields using `${Field name}` in the push content.
Assignment node — runs custom logic (ODPS SQL, Shell, or Python) and passes output to the data push node. Reference values using `${Input parameter name}` in the push content.
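To make the placeholder mechanics concrete, here is a small illustrative sketch (not the DataWorks implementation): each `${name}` token in the push body is replaced with the matching value produced by the ancestor node.

```python
import re

def render_push_body(template: str, values: dict) -> str:
    """Replace each ${name} placeholder with its value from the ancestor node."""
    return re.sub(
        r"\$\{([^}]+)\}",
        lambda m: str(values.get(m.group(1), m.group(0))),
        template,
    )

body = "Region: ${region}, revenue: ${revenue}"
print(render_push_body(body, {"region": "East", "revenue": 1024}))
# → Region: East, revenue: 1024

# In this sketch, unresolved placeholders are left as-is so misspelled
# field or parameter names are easy to spot in the delivered message.
print(render_push_body("${typo}", {}))
# → ${typo}
```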
Channel capabilities
Before you configure a destination, check the message limits that apply to your target channel:
| Channel | Limit |
|---|---|
| DingTalk | 20 KB per message |
| Lark | 20 KB per message; images must be less than 10 MB |
| WeCom | 20 messages per chatbot per minute |
| Microsoft Teams | 28 KB per message |
| Email | One email body per data push task |
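Because several channels cap the size of a single message, it can help to validate the rendered body before pushing. A minimal sketch with the size limits from the table above (the check itself is not part of DataWorks; WeCom's limit is a send rate rather than a size, so it is omitted here):

```python
# Per-message size limits in bytes, taken from the channel table above.
CHANNEL_LIMITS = {
    "DingTalk": 20 * 1024,
    "Lark": 20 * 1024,
    "Microsoft Teams": 28 * 1024,
}

def fits_channel(body: str, channel: str) -> bool:
    """Return True if the UTF-8 encoded body fits the channel's per-message cap."""
    limit = CHANNEL_LIMITS.get(channel)
    return limit is None or len(body.encode("utf-8")) <= limit

print(fits_channel("x" * 30000, "DingTalk"))  # 30,000 bytes exceeds 20 KB
```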
Format guidance:
| Format | Use when | Ancestor node |
|---|---|---|
| Markdown | You want a narrative message with inline data values | SQL query node or assignment node |
| Table | You want to display query results as a structured grid | SQL query node only |
Markdown example — the push body might look like:
## Daily sales report
- Region: ${region}
- Total orders: ${total_orders}
- Revenue: ${revenue}
- Report date: ${report_date}

Prerequisites
Before you begin, make sure you have:
An activated DataWorks service. See Purchase.
A DataWorks workspace. See Create a workspace.
A workflow in the workspace.
A serverless resource group created on or after June 28, 2024. Only serverless resource groups are supported. For setup instructions, see Use serverless resource groups.
Internet access enabled for the resource group. See Overview of network connectivity solutions.
If your serverless resource group was created before June 28, 2024, submit a ticket to upgrade it before proceeding.
Limitations
Email: Only one email body can be added per data push task.
Email SMTP: Additional limits depend on the email service you use. Check the Simple Mail Transfer Protocol (SMTP) limits of your provider.
Supported regions: The data push feature is available only in the following regions: China (Hangzhou), China (Shanghai), China (Beijing), China (Shenzhen), China (Chengdu), China (Hong Kong), Singapore, Malaysia (Kuala Lumpur), US (Silicon Valley), US (Virginia), and Germany (Frankfurt).
Step 1: Create an ancestor node
A data push node cannot query data on its own — it relies on an ancestor node to produce the data. Create either an SQL query node or an assignment node first.
To push MaxCompute data, use an assignment node, not an SQL query node. See MaxCompute data push.
Create an SQL query node
Log on to the DataWorks console. In the top navigation bar, select your region. In the left-side navigation pane, choose Data Development and O&M > Data Development. Select your workspace and click Go to Data Development.
In DataStudio, double-click your workflow. On the workflow configuration tab, click the icon and select the node type that matches your data source. In the Create Node dialog box, configure the node parameters and click Confirm.
Double-click the created SQL query node and write the query code.
You cannot push data from an ODPS SQL node directly. Create an assignment node instead and write the SQL statement there. See Configure data push flows in the workflow.
In the right-side navigation pane, click the Properties tab and configure the basic, time, resource, dependency, and context settings. See Configure basic properties, Configure time properties, Configure the resource property, Configure same-cycle scheduling dependencies, and Configure node context.
On the Properties tab, click the drop-down arrow next to Input and Output Parameters. Next to Output Parameters, click Add assignment parameter to add the `outputs` parameter.
In the top toolbar, click the icon to save.
Create an assignment node
Log on to the DataWorks console. In the top navigation bar, select your region. In the left-side navigation pane, choose Data Development and O&M > Data Development. Select your workspace and click Go to Data Development.
In DataStudio, double-click your workflow. On the workflow configuration tab, click the icon and select Assignment Node in the General section. In the Create Node dialog box, configure the node parameters and click Confirm.
Double-click the created assignment node. In the Language drop-down list, select ODPS SQL, SHELL, or Python and write the node code. See Assignment node.
In the top toolbar, click the icon to save.
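As an illustrative sketch (hypothetical values; assuming the assignment node's `outputs` parameter is taken from the node's final output, such as the last print of a Python node), the node code might emit one comma-separated row that the data push node later references by input parameter name:

```python
# Hypothetical assignment-node logic: compute the values to push.
region = "East"
total_orders = 42
revenue = 1024

# The last printed line becomes the node output that downstream nodes,
# including the data push node, can consume via the outputs parameter.
line = f"{region},{total_orders},{revenue}"
print(line)
```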
Step 2: Create a data push node
In DataStudio, double-click your workflow. On the workflow configuration tab, click the icon and select Data Push in the General section. In the Create Node dialog box, configure the following parameters and click Confirm.

| Parameter | Description |
|---|---|
| Node type | Select Data Push from the drop-down list |
| Path | Select the same path as the ancestor node created in Step 1 |
| Name | Enter a name based on your business requirements |

Double-click the created data push node to open its configuration tab.
Add the ancestor node as a parent node. In the right-side navigation pane, click Properties. In the Dependencies section, select Node Name from the drop-down list under Parent Nodes, enter the name of the ancestor node, and click Create.
In the Resource Group section of the Properties tab, select a serverless resource group created on or after June 28, 2024.
Add the `outputs` parameter of the ancestor node as an input parameter of the data push node. On the Properties tab, click the drop-down arrow next to Input and Output Parameters. Click Create next to Input Parameters, enter a parameter name in the Parameter Name column, and select the `outputs` parameter from the Value Source drop-down list. Close the Properties tab.
Configure the Destination, Title, and Body for the push node.
Destination: Select a destination from the Destination drop-down list. If no destination is available, click Create Destination and configure the following parameters:

| Parameter | Description |
|---|---|
| Type | The push channel: DingTalk, Lark, WeCom, Microsoft Teams, or Email |
| Destination name | A name based on your business requirements |
| WebHook | The chatbot webhook URL for DingTalk, Lark, or WeCom; the incoming webhook URL for Microsoft Teams; or the SMTP address for Email. Get these from the respective platform. |

For Lark webhooks, see Configure a Lark Webhook trigger. For Teams webhooks, see Create incoming webhooks with Workflows for Microsoft Teams. To manage existing destinations, go to the DataService Studio > Service Development tab, click the icon in the lower-left corner, and then click the Destination Management tab. See the Create a webhook destination section in "Data push."
Title: Enter a title for the message.
Body: Click Add and select Markdown or Table. See the Configure the push content section in "Data push" for more details.
Markdown — write free-form content and embed data values as placeholders:
If the ancestor node is an SQL query node, use `${Field name}` (the field names returned by the query).
If the ancestor node is an assignment node, use `${Input parameter name}` (the input parameter names defined on the data push node).
Table — select the fields from the SQL query node output to display as a table. This option is available only when the ancestor node is an SQL query node.
In the top toolbar, click the icon to save.
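For reference, a DingTalk custom chatbot webhook accepts a JSON payload like the one below. DataWorks builds and sends this for you, so this sketch (with hypothetical title and body values) only shows what travels over the wire; no request is actually made here.

```python
import json

# Hypothetical rendered title and body; in practice the data push node
# substitutes ${...} placeholders before building the payload.
payload = {
    "msgtype": "markdown",
    "markdown": {
        "title": "Daily sales report",
        "text": "## Daily sales report\n- Region: East\n- Revenue: 1024",
    },
}

# The push node POSTs this JSON to the chatbot webhook URL.
print(json.dumps(payload, ensure_ascii=False))
```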
Step 3: Test, commit, and deploy
After configuring the workflow, test all data push flows before deploying.
On the workflow configuration tab, click the icon to run the workflow.
When the icon appears next to all nodes, click the icon to commit.
In the Commit dialog box, select the nodes to commit, enter a description, and select Ignore I/O Inconsistency Alerts.
Click Confirm.
Deploy the nodes. See Publish tasks.
Best practices
DataWorks supports several data push patterns for different scenarios. See Best practice for configuring data push nodes in a workflow for examples covering simple data push, combined data push, script data push, conditional data push, and MaxCompute data push.
What's next
After all nodes are deployed, monitor and manage them in Operation Center. See Perform basic O&M operations on auto triggered nodes.