
DataWorks:Debugging procedure

Last Updated: Mar 26, 2026

DataStudio provides three debugging methods: Run, Run with Parameters, and Quick run. Use them to validate complete node code or a specific code snippet before scheduling.

You are not charged for scheduling resources during debugging, but you are charged for the compute engine instances used. For billing details, see the documentation of the related Alibaba Cloud services.

Prerequisites

Before you begin, ensure that you have:

Debug a node

  1. Open DataStudio. In the Scheduled Workflow or Manually Triggered Workflows pane on the left, double-click the node to open its configuration tab.

  2. Choose a debugging method based on what you need to test: To debug the complete code, click the Run or Run with Parameters icon in the top toolbar. To debug a snippet, click the Quick run icon to the left of the target code line.

    If you do not have permission to access the data you want to query, see Permission management for data in compute engine instances.
    | Method | What it does | When to use |
    | --- | --- | --- |
    | Run | Assigns values to variables and specifies a resource group for scheduling. The first time you run a node, the Parameters dialog box appears; assign constants to the variables, and DataWorks saves these values for subsequent runs. | Frequently debugging the same node |
    | Run with Parameters | Requires you to assign constants to variables and specify a resource group each time you run. | Changing variable values or the resource group between runs |
    | Quick run | Runs only the selected code snippet in the code section of the configuration tab. Verifies snippet correctness only; to debug the complete code, use Run or Run with Parameters. | Verifying a specific code snippet without running the entire node |
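The difference between Run and Quick run can be sketched outside DataWorks. The following is a hypothetical illustration using Python's standard `sqlite3` module, not DataWorks code: the SQL script, table names, and snippet are all invented, but the pattern is the same — a quick run executes one selected statement, while Run executes the whole script.

```python
import sqlite3

# Hypothetical stand-in for a DataWorks node: a script of several SQL
# statements. "Quick run" corresponds to executing one selected statement;
# "Run" / "Run with Parameters" execute the complete script.
full_node_sql = """
CREATE TABLE sales (region TEXT, amount INTEGER);
INSERT INTO sales VALUES ('east', 100), ('west', 250);
CREATE TABLE summary AS
    SELECT region, SUM(amount) AS total FROM sales GROUP BY region;
"""

snippet = "SELECT 1 + 1;"  # a snippet you might verify on its own first

conn = sqlite3.connect(":memory:")

# Quick run: check only the selected snippet.
quick_result = conn.execute(snippet).fetchone()[0]

# Run: execute the complete node code from top to bottom.
conn.executescript(full_node_sql)
summary_rows = conn.execute(
    "SELECT region, total FROM summary ORDER BY region"
).fetchall()
```

Verifying the snippet first is cheap; running the full script touches every table the node creates, which is why Quick run suits iterating on a single statement.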

Debug a workflow

  1. Open DataStudio. In the Scheduled Workflow or Manually Triggered Workflows pane on the left, double-click the workflow to open its configuration tab.

  2. Click the Run icon in the top toolbar. Nodes in the workflow run in sequence based on their dependencies.

    Note
    • For manually triggered workflows where multiple nodes share the same variable name, configure workflow parameters on the configuration tab to assign values in one place. After the run, the tab shows each node's value assignments and run details.

    • Workflow parameters can be configured only for specific node types.

    • To view a node's run log after the workflow completes, right-click the node on the configuration tab and select View Log.

View operating history

The Operating history pane on the DataStudio page lists all nodes run within the last three days under your current account.

Even if you close a node tab while it is running, the node continues to run on the compute engine. Go to the Operating history pane to view run logs or stop the node.

Use ad hoc query

Use an ad hoc query to query data, test SQL code, or verify code correctness in the development environment.

If you do not have permission to access the data you want to query, see Permission management for data in compute engine instances.

Process query results

After SQL code runs, you can analyze, share, or download the results.

| Operation | Description | Reference |
| --- | --- | --- |
| Analyze data | Sync query results to a workbook for further analysis. | Analyze data |
| Share data | Sync results to a workbook and share them with specific users using the workbook's data sharing feature. | Share data |
| Download data | Download results as a workbook to your local machine. Up to 10,000 records are displayed by default. | Download data |
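The default record cap on downloads amounts to a simple truncation of the result set. A hypothetical sketch; the function name and data are invented, and only the 10,000-record default comes from the table above:

```python
MAX_DOWNLOAD_RECORDS = 10_000  # default cap noted in the table above

def truncate_for_download(rows, limit=MAX_DOWNLOAD_RECORDS):
    """Return at most `limit` records for export (hypothetical helper)."""
    return rows[:limit]

# Simulated query result larger than the cap.
results = [{"id": i} for i in range(15_000)]
exported = truncate_for_download(results)
```

Administrators can raise or lower this cap on the Data query and analysis control page, as described in the next section.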

Access control for query results

The tenant administrator, tenant security administrator, and Resource Access Management (RAM) users with the Workspace Manager role can go to the Data query and analysis control page to:

  • Set the maximum number of SQL query result records that can be viewed or downloaded

  • Control whether users can download data

For role assignment details, see Add a RAM user to a workspace as a member and assign roles to the member.

Edition requirements

The download feature requires a DataWorks advanced edition. To upgrade, see Differences among DataWorks editions.