
DataWorks:Data push node

Last Updated:Mar 26, 2026

The Data Push Node sends query results from upstream SQL query nodes in a Data Studio workflow to external destinations: DingTalk, Lark, WeCom, Microsoft Teams, and email.

How it works

The Data Push Node retrieves output parameters from upstream SQL query nodes using Input and Output Parameters. These parameters serve as placeholders in the message body, which the node pushes to a configured destination.

  1. An upstream SQL query node runs and generates an output parameter named outputs containing the results.

  2. In the downstream Data Push Node, the Input and Output Parameters configuration retrieves the outputs parameter from the upstream node and assigns it as an input parameter.

  3. The Data Push Node references the input parameter in the message content and pushes it to the specified destination.
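The three steps above can be sketched as follows. This is a conceptual illustration of the parameter flow only, not DataWorks internals; the parameter value and the message template are hypothetical. Note that Python's `string.Template` happens to use the same `${name}` placeholder syntax as the Data Push Node's message body.

```python
from string import Template

# Step 1 (hypothetical upstream result): an SQL query node produces a
# result set that is exposed as the output parameter "outputs".
upstream_outputs = "2026-03-25,orders,1024"

# Step 2: the Data Push Node maps the upstream output parameter to one
# of its own input parameters (here also named "outputs").
input_parameters = {"outputs": upstream_outputs}

# Step 3: ${parameter_name} placeholders in the message body are
# replaced with the input parameter values before the push.
body_template = Template("Daily report: ${outputs}")
message = body_template.substitute(input_parameters)

print(message)  # Daily report: 2026-03-25,orders,1024
```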

Supported upstream SQL query nodes

| Node type | Notes |
| --- | --- |
| Assignment Node | Required when the data source is a MaxCompute SQL Node. The Data Push Node cannot retrieve data directly from a MaxCompute SQL Node; use an Assignment Node as an intermediary. |
| Hologres SQL Node | None |
| ClickHouse SQL Node | None |
| EMR Spark SQL Node | None |
| EMR Hive Node | None |
| MaxCompute Script Node | None |
| ADB for PostgreSQL Node | None |
| MySQL Node | None |

Prerequisites

Before you begin, make sure you have:

Limitations

Data size limits by destination:

| Destination | Limit |
| --- | --- |
| DingTalk | 20 KB per message |
| Lark | 20 KB per message; images must be smaller than 10 MB |
| WeCom | 20 messages per minute per bot |
| Microsoft Teams | 28 KB per message |
| Email | One email body per Data Push task. Additional limits depend on your email service provider's SMTP restrictions. |
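If you generate large message bodies, a pre-check against these limits can avoid failed pushes. The sketch below assumes the limits apply to the UTF-8 encoded size of the body, which is a reasonable but unverified interpretation; the function name is hypothetical.

```python
# Size limits from the table above, in bytes (WeCom's limit is
# rate-based rather than size-based, so it is omitted here).
LIMITS_BYTES = {"dingtalk": 20 * 1024, "lark": 20 * 1024, "teams": 28 * 1024}

def fits(destination: str, body: str) -> bool:
    """Return True if the UTF-8 encoded body fits the destination's limit."""
    return len(body.encode("utf-8")) <= LIMITS_BYTES[destination]

print(fits("dingtalk", "x" * 30000))  # False: 30000 bytes > 20480
print(fits("teams", "x" * 20000))     # True: 20000 bytes <= 28672
```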

Available regions: China (Hangzhou), China (Shanghai), China (Beijing), China (Shenzhen), China (Chengdu), China (Hong Kong), Singapore, Japan (Tokyo), US (Silicon Valley), US (Virginia).

Navigate to Data Studio

  1. Go to the Workspaces page in the DataWorks console. In the top navigation bar, select a region. Find the target workspace and choose Shortcuts > Data Studio in the Actions column.

  2. In the left-side navigation pane, click the Data Studio icon to open the Data Studio page.

Create a data push workflow

Create a workflow that includes an SQL query node and a Data Push Node.

  1. Create a Scheduled Workflow and add an SQL query node and a Data Push Node.

  2. In the SQL query node, write a query to retrieve the data you want to push.

    The Data Push Node cannot retrieve data directly from a MaxCompute SQL Node. To use MaxCompute data, create an Assignment Node and write your query there. Other SQL query node types can connect directly.
  3. Configure the SQL query node. In the Scheduling pane, set Computing Resource, Resource Group, and Same-cycle Dependency. If there are no upstream nodes, select Use Workspace Root Node. Output parameter configuration varies by node type:

    | Node type | Configuration |
    | --- | --- |
    | Assignment Node | Has a default output parameter; no additional configuration is required. |
    | Other SQL query nodes | No default output parameter. In the Scheduling pane, go to Input and Output Parameters > Node Output Parameters and click Add Assignment Parameter to pass query results downstream. |
  4. Create a Data Push Node and configure the SQL query node as its upstream node.

    1. Click Scheduling. Under Scheduling Dependencies > Same-cycle Dependency, find your SQL query node by name and click Add.

    2. Under Scheduling Policies > Resource Group, select your Serverless Resource Group.

    3. Under Input and Output Parameters > Input Parameters, click Add Parameter and configure the new parameter to use the output parameter of the SQL query node as its value source.

Configure the destination and message content

In the Data Push Node editor, define a push title and configure the destination and message body.

Configure the destination

In the Destination section, select a destination. To create a new destination, click Create Destination. You can also set a data push destination in DataService Studio.

To modify an existing destination, go to DataService Studio > Data Push.

On the Create Destination page, configure the parameters for your destination type:

Webhook

| Parameter | Description |
| --- | --- |
| Destination | Select DingTalk, Lark, WeCom, or Teams. |
| Destination Name | Enter a custom name for the destination. |
| Webhook | Obtain the webhook URL for your selected destination and enter it here. |
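For context, a webhook destination is simply an HTTPS endpoint that accepts a JSON payload. The sketch below shows what a push to a DingTalk-style incoming webhook could look like; the URL is a placeholder and the payload shape is an assumption based on DingTalk's custom-robot format, so check your destination's webhook documentation before relying on it.

```python
import json
from urllib import request

# Hypothetical webhook URL copied from the target group's bot settings.
WEBHOOK_URL = "https://oapi.dingtalk.com/robot/send?access_token=<token>"

def build_dingtalk_markdown(title: str, text: str) -> dict:
    """Build a markdown message in DingTalk's assumed incoming-webhook shape."""
    return {"msgtype": "markdown", "markdown": {"title": title, "text": text}}

def push(url: str, payload: dict) -> None:
    """POST the JSON payload to the webhook endpoint."""
    req = request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    request.urlopen(req)  # raises on HTTP errors

payload = build_dingtalk_markdown("Daily report", "**orders**: 1024")
# push(WEBHOOK_URL, payload)  # uncomment with a real webhook URL
```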

Email

| Parameter | Description |
| --- | --- |
| Destination | Select Email. |
| SMTP Host | The SMTP server address. |
| SMTP Port | The SMTP server port. The default is 465. |
| Sender Address | The sender's email address. |
| Sender Nickname | Optional. A custom display name for the sender. |
| SMTP Account | The SMTP login account. |
| SMTP Password | The password for the SMTP account. |
| Receiver Address | Recipient email addresses. Separate multiple addresses with commas. |
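To see how these parameters fit together, here is a minimal sketch of an equivalent send using Python's standard library. All values are hypothetical placeholders; port 465 implies implicit TLS, which is why `SMTP_SSL` is used rather than `SMTP` with STARTTLS.

```python
import smtplib
from email.message import EmailMessage

# Hypothetical values matching the parameters above.
SMTP_HOST = "smtp.example.com"
SMTP_PORT = 465                             # default port from the table above
SENDER = "report-bot@example.com"
RECEIVERS = "a@example.com,b@example.com"   # comma-separated, as above

def build_email(subject: str, body: str) -> EmailMessage:
    """Assemble the single email body a Data Push task sends."""
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = SENDER
    msg["To"] = [addr.strip() for addr in RECEIVERS.split(",")]
    msg.set_content(body)
    return msg

def send(msg: EmailMessage, account: str, password: str) -> None:
    """Send over SMTPS (implicit TLS on port 465)."""
    with smtplib.SMTP_SSL(SMTP_HOST, SMTP_PORT) as server:
        server.login(account, password)
        server.send_message(msg)

msg = build_email("Daily report", "orders: 1024")
# send(msg, "smtp-account", "smtp-password")  # supply real credentials
```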

Configure the message body

The message body supports three formats:

| Format | Description |
| --- | --- |
| Markdown | Use ${parameter_name} placeholders to include values from the node's Input Parameters. |
| Table | Use field names from the upstream SQL query node's output as parameters to populate the table columns. |
| Email body | A dedicated body for email notifications. Each Data Push task supports only one email body. Email body content is ignored for webhook destinations (DingTalk, Lark, WeCom, Teams). |
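To illustrate the Table format, the sketch below renders query output into a markdown table: field names become the column headers, with one row per result record. The field names and rows are hypothetical, and the exact rendering DataWorks produces may differ.

```python
# Hypothetical field names and rows from an upstream query's output.
fields = ["ds", "table_name", "row_count"]
rows = [("2026-03-25", "orders", 1024), ("2026-03-25", "users", 512)]

# Header row, separator row, then one line per record.
lines = ["| " + " | ".join(fields) + " |",
         "| " + " | ".join("---" for _ in fields) + " |"]
for row in rows:
    lines.append("| " + " | ".join(str(v) for v in row) + " |")

print("\n".join(lines))
# | ds | table_name | row_count |
# | --- | --- | --- |
# | 2026-03-25 | orders | 1024 |
# | 2026-03-25 | users | 512 |
```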

Run and debug the workflow

After saving the SQL query node and the Data Push Node, click the Run icon in the workflow toolbar to run and debug the task.

What's next

  • Configure node scheduling: To run the node on a schedule, configure its properties in the Scheduling pane under Scheduling Policies.

  • Deploy the node: Click the Deploy icon to start the deployment process. Scheduled tasks run only after deployment to the Production Environment.