
DataWorks:Data push node

Last Updated:Oct 15, 2025

The data push node sends query results from Data Studio workflows to DingTalk, Lark, WeCom, Teams, or email, keeping your team instantly updated on the latest data.

Background

The data push node retrieves output parameters from upstream SQL query nodes (like assignment nodes, Hologres SQL nodes, or ClickHouse SQL nodes) via Input and Output Parameters. You can embed these parameters as placeholders in your message content, then push to your chosen destination.


The process:

  • Upstream SQL node completes and generates an outputs parameter containing query results.

  • Data push node retrieves the outputs parameter via Input and Output Parameters and binds it as a node input parameter.

  • Configure and send by referencing input parameters in your message, setting the destination, and pushing.
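The steps above can be sketched end to end in plain Python. This is purely illustrative: DataWorks passes the parameters between nodes for you, and the CSV-style shape of the `outputs` value is an assumption made for the example.

```python
# Step 1: the upstream SQL node finishes and emits an `outputs` parameter
# holding query results (format here is an illustrative assumption).
outputs = "2025-10-14,1532\n2025-10-15,1611"

# Step 2: the data push node binds it as an input parameter named `outputs`.
input_params = {"outputs": outputs}

# Step 3: the message references the bound parameter; here we render the
# rows as a small markdown table before pushing to a destination.
rows = [line.split(",") for line in input_params["outputs"].splitlines()]
lines = ["| date | order_count |", "| --- | --- |"]
lines += [f"| {date} | {count} |" for date, count in rows]
message = "\n".join(lines)
print(message)
```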

Supported upstream SQL nodes

Prerequisites

Limitations

  • Size and rate limits:

    | Destination | Limit |
    | --- | --- |
    | DingTalk | Max 20KB per message. |
    | Lark | Max 20KB per message; images must be under 10MB. |
    | WeCom | Max 20 messages per minute per bot. |
    | Teams | Max 28KB per message. |
    | Email | One email body per task; see your email provider's SMTP limits for additional restrictions. |

  • Supported regions for data push:

    China (Hangzhou), China (Shanghai), China (Beijing), China (Shenzhen), China (Chengdu), China (Hong Kong), Singapore, US (Silicon Valley), US (Virginia).
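Before pushing, it can help to check a rendered message against the per-destination limits in the table above. A minimal sketch, assuming the limits are measured in bytes of the encoded message (how each platform actually counts size is not specified here):

```python
# Per-message size limits in bytes, taken from the limits table above.
# Whether each platform counts bytes or characters is an assumption.
SIZE_LIMITS = {
    "DingTalk": 20 * 1024,
    "Lark": 20 * 1024,
    "Teams": 28 * 1024,
}

def within_limit(destination: str, message: str) -> bool:
    """Return True if the UTF-8 encoded message fits the destination's limit."""
    limit = SIZE_LIMITS.get(destination)
    if limit is None:
        # No documented per-message size limit (e.g. WeCom is rate-limited,
        # email is bounded by the provider's SMTP limits instead).
        return True
    return len(message.encode("utf-8")) <= limit

print(within_limit("DingTalk", "x" * 100))       # comfortably under 20KB
print(within_limit("Teams", "x" * (29 * 1024)))  # exceeds the 28KB limit
```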

Access Data Studio

  1. Go to the Workspaces page in the DataWorks console. In the top navigation bar, select a desired region. Find the desired workspace and choose Shortcuts > Data Studio in the Actions column.

  2. In the left navigation pane, click the image icon.

Create your workflow

  1. Create a scheduled workflow.

  2. Configure the SQL query node and add your SQL query code.

    Note

    Data push nodes can't retrieve output from ODPS SQL nodes directly. To query MaxCompute data, create an assignment node instead.

  3. Set up Scheduling on the right side of the node: select your Computing Resource and Resource Group, then select upstream nodes for Same-cycle Dependency (or choose Use Workspace Root Node if none exist). Configure output parameters:

    | Node type | Configuration |
    | --- | --- |
    | Assignment node | Output parameters are created automatically; no setup needed. |
    | Other SQL nodes | Manually add output parameters: go to Scheduling > Input and Output Parameters > Node Output Parameters, then click Add Assignment Parameter. |

  4. Create a data push node and configure it as the downstream node.

    1. Click Scheduling, and in Scheduling Dependencies > Same-cycle Dependency, click Add and select your SQL query node as the upstream dependency.

    2. In Scheduling Policies > Resource Group, select the created serverless resource group.

    3. In Input and Output Parameters > Input Parameters, click Create Parameter. Bind the output parameter from the SQL node as the input parameter for the data push node.

Configure your push

Set up destinations

  1. In the data push node, select Destination. If your destination doesn't exist yet, click Create Destination or configure push targets in DataService Studio.

    Note

    To modify existing push targets, go to DataService Studio > Data Push.

  2. Choose your destination type:

For DingTalk, Lark, WeCom, or Teams

| Parameter | Description |
| --- | --- |
| Destination | Select DingTalk, Lark, WeCom, or Teams. |
| Destination Name | Choose a descriptive name. |
| Webhook | Paste the webhook URL from your messaging platform. |
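The data push node delivers webhook messages for you; purely to illustrate what a webhook push involves, here is a minimal sketch that POSTs a markdown message in DingTalk's custom-robot payload format. The access token is a placeholder, and each platform (Lark, WeCom, Teams) expects its own payload schema.

```python
import json
import urllib.request

def build_dingtalk_payload(title: str, text: str) -> dict:
    """Markdown message payload in DingTalk's custom-robot format."""
    return {"msgtype": "markdown", "markdown": {"title": title, "text": text}}

def push(webhook_url: str, title: str, text: str) -> int:
    """POST the payload to the webhook URL and return the HTTP status."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(build_dingtalk_payload(title, text)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Usage (hypothetical access token -- replace with your bot's webhook URL):
# push("https://oapi.dingtalk.com/robot/send?access_token=...",
#      "Daily report", "**orders**: 1611")
```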

For email

| Parameter | Description |
| --- | --- |
| Destination | Select Email. |
| SMTP Host | Your email server address. |
| SMTP Port | Server port (default: 465). |
| Sender Address | Email address used to send messages. |
| Sender Nickname | Optional display name. |
| SMTP Account | Login username for your SMTP server. |
| SMTP Password | Authentication password. |
| Receiver Address | Email recipients (comma-separated for multiple addresses). |
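DataWorks handles the delivery itself; as an illustration of how the parameters above fit together, here is a minimal Python sketch using implicit TLS on the default port 465. All addresses and credentials are placeholders.

```python
import smtplib
from email.mime.text import MIMEText
from email.utils import formataddr

def build_email(sender: str, nickname: str, receivers: list,
                subject: str, body: str) -> MIMEText:
    """Assemble a plain-text email from the destination parameters above."""
    msg = MIMEText(body, "plain", "utf-8")
    msg["From"] = formataddr((nickname, sender))       # Sender Nickname + Address
    msg["To"] = ", ".join(receivers)                   # Receiver Address (comma-separated)
    msg["Subject"] = subject
    return msg

def send_email(host: str, port: int, account: str, password: str,
               sender: str, receivers: list, msg: MIMEText) -> None:
    """Send over implicit TLS, matching the default SMTP Port of 465."""
    with smtplib.SMTP_SSL(host, port) as server:
        server.login(account, password)                # SMTP Account / Password
        server.sendmail(sender, receivers, msg.as_string())

# Usage (placeholder values):
# msg = build_email("bot@example.com", "Data Push", ["team@example.com"],
#                   "Daily report", "orders: 1611")
# send_email("smtp.example.com", 465, "bot@example.com", "secret",
#            "bot@example.com", ["team@example.com"], msg)
```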

Configure your message

You can format content using Markdown, tables, or email bodies:

  • Markdown: Reference input parameters using ${parameter_name} placeholders.

  • Tables: Use field names from upstream SQL nodes as Parameters.

  • Email bodies:

    • Limit one email body per data push task.

    • Email bodies render only when the destination is email (hidden in webhook messages).
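The `${parameter_name}` substitution is performed by DataWorks when the message is rendered; the following sketch only illustrates how the placeholder syntax maps bound input parameters into message content.

```python
import re

def render_message(template: str, params: dict) -> str:
    """Replace each ${name} placeholder with its bound input parameter value.

    Unknown placeholders are left untouched so missing bindings are visible.
    """
    return re.sub(
        r"\$\{(\w+)\}",
        lambda m: str(params.get(m.group(1), m.group(0))),
        template,
    )

print(render_message("Latest count: ${outputs}", {"outputs": "1611"}))
# -> Latest count: 1611
```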

Test

After configuring and saving both nodes, click the image icon in the toolbar to test your data push task.

What to do next

  • Node scheduling: To run nodes on a recurring schedule, configure the Scheduling policies in Scheduling.

  • Node deployment: To run tasks in production, click the image icon to start the deployment process. Nodes only execute on schedule after deploying to production.