The data push node sends query results from Data Studio workflows to DingTalk, Lark, WeCom, Teams, or email, keeping your team instantly updated on the latest data.
Background
The data push node retrieves output parameters from upstream SQL query nodes (like assignment nodes, Hologres SQL nodes, or ClickHouse SQL nodes) via Input and Output Parameters. You can embed these parameters as placeholders in your message content, then push to your chosen destination.
The process:
1. The upstream SQL node completes and generates an `outputs` parameter containing the query results.
2. The data push node retrieves the `outputs` parameter via Input and Output Parameters and binds it as a node input parameter.
3. Configure and send: reference the input parameters in your message content, set the destination, and push.
Supported upstream SQL nodes:
Assignment node: Use this for MaxCompute SQL queries (direct connections to MaxCompute SQL nodes aren't supported yet).
Other SQL nodes: Hologres SQL node, ClickHouse SQL node, EMR Spark SQL nodes, EMR Hive node, MaxCompute Script node, AnalyticDB for PostgreSQL node, and MySQL node.
Prerequisites
You have created a workspace.
You have added a serverless resource group and bound it to your DataWorks workspace.
Limitations
Size and rate limits:
| Destination | Limit |
| --- | --- |
| DingTalk | Max 20 KB per message. |
| Lark | Max 20 KB per message; images must be under 10 MB. |
| WeCom | Max 20 messages per minute per bot. |
| Teams | Max 28 KB per message. |
| Email | One email body per task; see your email provider's SMTP limits for additional restrictions. |
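If you assemble message content programmatically before a push, it can help to verify it against the per-destination size limits in the table above. The following is an illustrative sketch, not part of any DataWorks API; the destination names and byte limits mirror the table.

```python
# Hypothetical pre-send size check based on the limits table above.
MESSAGE_LIMITS_BYTES = {
    "DingTalk": 20 * 1024,  # 20 KB per message
    "Lark": 20 * 1024,      # 20 KB per message
    "Teams": 28 * 1024,     # 28 KB per message
}

def fits_limit(destination: str, message: str) -> bool:
    """Return True if the UTF-8 encoded message fits the destination's size limit."""
    limit = MESSAGE_LIMITS_BYTES.get(destination)
    if limit is None:
        # WeCom is rate-limited (20 messages/minute), not size-limited here.
        return True
    return len(message.encode("utf-8")) <= limit

print(fits_limit("DingTalk", "Daily report: 1,024 rows loaded."))
```

Note that the limits apply to the encoded byte size, so multi-byte characters count for more than one byte each.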
Supported regions for data push:
China (Hangzhou), China (Shanghai), China (Beijing), China (Shenzhen), China (Chengdu), China (Hong Kong), Singapore, US (Silicon Valley), US (Virginia).
Access Data Studio
Go to the Workspaces page in the DataWorks console. In the top navigation bar, select a desired region. Find the desired workspace and choose in the Actions column.
In the left navigation pane, click the icon.
Create your workflow
Create a scheduled workflow.
Configure the SQL query node and add your SQL query code.
Note: Data push nodes can't directly access ODPS SQL. To query MaxCompute data, create an assignment node instead.
Set up Scheduling on the right side of the node by selecting your Computing Resource, Resource Group, and selecting upstream nodes for Same-cycle Dependency (or choose Use Workspace Root Node if none exist). Configure output parameters:
| Node type | Configuration |
| --- | --- |
| Assignment node | Output parameters are created automatically; no setup needed. |
| Other SQL nodes | Manually add output parameters: go to then click Add Assignment Parameter. |
Create a data push node and configure it as the downstream node.
Click Scheduling, and in , click Add and select your SQL query node as the upstream dependency.
In , select the created serverless resource group.
In , click Create Parameter. Bind the output parameter from the SQL node as the input parameter for the data push node.
Configure your push
Set up destinations
In the data push node, select Destination. If your destination doesn't exist yet, click Create Destination or configure push targets in DataService Studio.
Note: To modify existing push targets, go to .
Choose your destination type:
For DingTalk, Lark, WeCom, or Teams
| Parameter | Description |
| --- | --- |
| Destination | Select DingTalk, Lark, WeCom, or Teams. |
| Destination Name | Choose a descriptive name. |
| Webhook | Paste the webhook URL from your messaging platform. |
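For reference, a webhook push of this kind is an HTTP POST of a JSON body to the bot's URL. The sketch below builds a DingTalk-style markdown payload with only the Python standard library; the webhook URL, title, and text are placeholders, and the actual network call requires a bot with a valid access token.

```python
import json
import urllib.request

def build_dingtalk_markdown(title: str, text: str) -> dict:
    """Build the JSON body for a DingTalk custom-bot markdown message."""
    return {"msgtype": "markdown", "markdown": {"title": title, "text": text}}

def push(webhook_url: str, payload: dict) -> None:
    """POST the payload to the bot webhook (needs a real access token)."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # network call; not run in this sketch

payload = build_dingtalk_markdown("Daily report", "**Rows loaded:** 1024")
print(json.dumps(payload, ensure_ascii=False))
```

Lark, WeCom, and Teams bots accept similar webhook POSTs, but each expects its own JSON schema; check the platform's bot documentation for the exact body format.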
For email
| Parameter | Description |
| --- | --- |
| Destination | Select Email. |
| SMTP Host | Your email server address. |
| SMTP Port | Server port (default: 465). |
| Sender Address | Email address used to send messages. |
| Sender Nickname | Optional display name. |
| SMTP Account | Login username for your SMTP server. |
| SMTP Password | Authentication password. |
| Receiver Address | Email recipients (comma-separated for multiple addresses). |
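The email parameters above map onto a standard SMTP-over-SSL send. The sketch below shows that mapping with Python's `smtplib`; every host, address, and credential is a placeholder, and the send itself is left commented out since it needs a reachable server.

```python
import smtplib
from email.mime.text import MIMEText

# Placeholder values mirroring the parameters in the table above.
SMTP_HOST = "smtp.example.com"                   # SMTP Host
SMTP_PORT = 465                                  # SMTP Port (default SSL port)
SENDER = "reports@example.com"                   # Sender Address
RECEIVERS = ["a@example.com", "b@example.com"]   # Receiver Address

def build_message(subject: str, body_html: str) -> MIMEText:
    """Build an HTML email body addressed from SENDER to RECEIVERS."""
    msg = MIMEText(body_html, "html", "utf-8")
    msg["Subject"] = subject
    msg["From"] = SENDER
    msg["To"] = ", ".join(RECEIVERS)
    return msg

msg = build_message("Daily data push", "<p>1,024 rows loaded</p>")

# Actual send (requires real credentials and a reachable server):
# with smtplib.SMTP_SSL(SMTP_HOST, SMTP_PORT) as server:
#     server.login("smtp_account", "smtp_password")   # SMTP Account / Password
#     server.sendmail(SENDER, RECEIVERS, msg.as_string())
print(msg["To"])
```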
Configure your message
You can format content using Markdown, tables, or email bodies:
- Markdown: Reference input parameters using `${parameter_name}` placeholders.
- Tables: Use field names from upstream SQL nodes as parameters.
- Email bodies:
  - Limit one email body per data push task.
  - Email bodies render only when the destination is email (they are hidden in webhook messages).
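The `${parameter_name}` placeholder style happens to match Python's `string.Template` syntax, which makes it easy to preview a message locally before configuring the node. In this sketch, `ds` and `row_count` are hypothetical input parameters bound from an upstream SQL node's `outputs`.

```python
from string import Template

# Preview a push message by substituting sample parameter values.
template = Template("Report for ${ds}: ${row_count} rows loaded.")
preview = template.substitute(ds="2024-06-01", row_count=1024)
print(preview)  # Report for 2024-06-01: 1024 rows loaded.
```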
Test
After configuring and saving both nodes, click the icon in the toolbar to test your data push task.
What to do next
Node scheduling: To run nodes on a recurring schedule, configure the Scheduling policies in Scheduling.
Node deployment: To run tasks in production, click the icon to start the deployment process. Nodes only execute on schedule after deploying to production.