
DataWorks:Manage triggers

Last Updated: Dec 25, 2025

Traditional recurring schedules rely on fixed-time polling, which can waste resources and delay data processing when the processing depends on unpredictable external events, such as new file uploads to OSS or new messages in Kafka. DataWorks triggers are designed for these scenarios: they listen for external event signals in real time and start the associated data pipelines on demand. This enables event-driven data pipelines and improves both automation and the timeliness of data processing. This topic describes how to create, configure, and manage event-driven scheduling triggers.

Function introduction

A DataWorks trigger is a component for event-driven scheduling that monitors predefined events. When an event occurs, the trigger starts the associated event-triggered workflow and passes the event information to the workflow as parameters.

  • Multi-source event monitoring: You can configure a trigger to monitor events from different sources.

    • External storage events: Monitor the creation of new files in an OSS bucket.

    • External message events: Monitor the arrival of new messages in message queues such as Kafka and RocketMQ.

  • Dynamic parameter capture and passing: When a trigger detects an event, it captures the event's context information and passes this information to downstream nodes as parameters.

    • For OSS events, the captured information includes file details.

    • For message queue events, the captured information includes the full message content, such as the key, value, and headers.

    The inner nodes of the triggered workflow can reference these parameters during execution, so that data processing can be based on the event content (see the sketch after this list).

  • Supported types:

    • Message queues: Kafka, RocketMQ, and RabbitMQ.

    • Storage objects: OSS.
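
The following sketch is a minimal illustration of how an inner node might consume these parameters. It assumes a Python node whose scheduling parameters are set to ${workflow.triggerMessage} and ${workflow.triggerMessage.key} and whose resolved values are passed to the script as command-line arguments. The parameter names, the node type, and the argument-passing mechanism are assumptions made for illustration; only the ${workflow.triggerMessage} expressions come from this topic.

  # Hypothetical inner-node script. Assumes two scheduling parameters are
  # configured on the node, for example:
  #   event_msg=${workflow.triggerMessage}
  #   event_key=${workflow.triggerMessage.key}
  # and that their resolved values arrive as command-line arguments. The
  # parameter names and the passing mechanism are assumptions; only the
  # ${workflow.triggerMessage} expressions come from this topic.
  import json
  import sys

  def main() -> None:
      event_msg_raw, event_key = sys.argv[1], sys.argv[2]

      # Full message body captured by the trigger (assumed to be JSON here).
      event = json.loads(event_msg_raw)
      print(f"Triggered by message with key: {event_key}")

      # Branch the data processing on the event content, for example on a
      # field inside the message value.
      payload = event.get("value", {})
      print(f"Processing payload: {payload}")

  if __name__ == "__main__":
      main()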

Applicability

  • Regions: This feature is available only in the China (Hangzhou), China (Beijing), China (Zhangjiakou), China (Ulanqab), China (Shenzhen), China (Hong Kong), and Singapore regions.

  • Editions: This feature is available only in DataWorks Professional Edition and later. If your current edition does not support this feature, you can upgrade DataWorks to the Professional Edition or a higher edition.

Billing

If a workflow is configured to run based on an event, EventBridge charges fees based on the number of events, in addition to the fees for scheduled task runs. For more information, see Billing rules for event streams.

Preparations

Go to the trigger management page

  1. Go to the Operation Center page.

    Log on to the DataWorks console. In the top navigation bar, select the desired region. In the left-side navigation pane, choose Data Development and O&M > Operation Center. On the page that appears, select the desired workspace from the drop-down list and click Go to Operation Center.

  2. In the navigation pane on the left of the Operation Center page, choose Other > Tenant Schedule Setting, and then click the Trigger Management tab.

Create a trigger

If you have the Developer, O&M, or Administrator role for the workspace, you can create a trigger on the Trigger Management tab.

  1. Click Create Trigger to go to the Create Trigger configuration page.

  2. Configure the parameters as described in the following tables.

    ApsaraMQ for Kafka

    Parameter

    Configuration description

    Workspace

    Select the workspace where the trigger can be used. Only workspaces that support the new Data Studio are available.

    Applicable Environment

    Triggers take effect only in the production environment. If you run the workflow in the development environment, you must manually enter the parameter values.

    Owner

    Select an owner for the trigger from the drop-down list.

    Trigger Type

    Select ApsaraMQ for Kafka.

    Trigger Event

    Supports the alikafka:Topic:Message event type. The event is triggered when a Kafka message is received. For information about how to send messages, see Send and receive Kafka messages. A minimal producer sketch also follows this table.

    Kafka Instance

    Select a Kafka instance in the same region as your workspace. If no instances are available, go to Purchase a Kafka instance to create one.

    Topic

    Specify the Topic that the trigger listens to. If no topics are available, see Create Topic to create one.

    Key

    Optional. You can preset a key for the message. The workflow is triggered only if the message key matches exactly. If you leave this parameter empty, any message triggers the workflow.

    ConsumerGroupId

    You can select Quick Create or Use Existing. If you select Quick Create, the system automatically creates a group ID with a generated name.

    Message Format Example

    A fixed example of a Kafka message. In the triggered workflow, you can use ${workflow.triggerMessage} to get the full message body, or ${workflow.triggerMessage.xxx} to get the value of a specific field in the message body.
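
    For reference, the following sketch shows one way to send a test message that such a trigger would react to, using the open source kafka-python client. It is a minimal sketch only: the endpoint, topic, and key are placeholders, and an ApsaraMQ for Kafka instance may additionally require SASL/SSL settings that are omitted here.

      # Minimal sketch: produce one Kafka message that a trigger listening on
      # this topic (and, optionally, on this exact key) would react to.
      # The endpoint, topic, and key are placeholders; authentication settings
      # required by your instance are omitted.
      from kafka import KafkaProducer  # pip install kafka-python

      producer = KafkaProducer(
          bootstrap_servers="your-kafka-endpoint:9092",  # placeholder endpoint
          key_serializer=lambda k: k.encode("utf-8"),
          value_serializer=lambda v: v.encode("utf-8"),
      )

      # If the trigger has a preset Key, the key below must match it exactly;
      # otherwise, any message on the topic fires the trigger.
      producer.send("your-topic", key="task-ready", value='{"date": "2025-12-25"}')
      producer.flush()
      producer.close()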

    ApsaraMQ for RocketMQ

    Parameter

    Configuration description

    Workspace

    Select the workspace where the trigger can be used. Only workspaces that support the new Data Studio are available.

    Applicable Environment

    Triggers take effect only in the production environment. If you run the workflow in the development environment, you must manually enter the parameter values.

    Owner

    Select an owner for the trigger from the drop-down list.

    Trigger Type

    Select Message Queue For Apache RocketMQ.

    Note

    Versions earlier than 5.x are not supported. Version 5.x is used by default.

    Trigger Event

    Supports the mq:Topic:SendMessage event type. The event is triggered when a RocketMQ message is consumed.

    ApsaraMQ for RocketMQ Instance

    Select a RocketMQ instance in the same region as the workspace. If no active instances are available, create one in RocketMQ Instance Management.

    Topic

    Specify the Topic that the trigger listens to. If no topics are available, go to Topic Management to create one.

    Tag

    Optional. You can preset a tag for the message. The workflow is triggered only if the message tag matches exactly. If you leave this parameter empty, any message triggers the workflow.

    Security Group

    If the RocketMQ instance is a subscription instance, you can select a security group.

    Consumer Group

    You can select Quick Create or Use Existing. If you select Quick Create, the system automatically creates a group ID with a generated name.

    Message Format Example

    A fixed example of a RocketMQ message. In the triggered workflow, you can use ${workflow.triggerMessage} to get the full message body, or ${workflow.triggerMessage.xxx} to get the value of a specific field in the message body.

    ApsaraMQ for RabbitMQ

    Parameter

    Configuration description

    Workspace

    Select the workspace where the trigger can be used. Only workspaces that support the new Data Studio are available.

    Applicable Environment

    Triggers take effect only in the production environment. If you run the workflow in the development environment, you must manually enter the parameter values.

    Owner

    Select an owner for the trigger from the drop-down list.

    Trigger Type

    Select ApsaraMQ for RabbitMQ.

    Trigger Event

    Supports the amqp:Queue:SendMessage event type. The event is triggered when a RabbitMQ message is consumed.

    ApsaraMQ for RabbitMQ Instance

    Select a RabbitMQ instance in the same region as the workspace. If no active instances are available, see Create a RabbitMQ instance to create one.

    Vhost

    The name of the RabbitMQ virtual host, which is used to logically isolate queues. If you do not have a Vhost, see Vhost management to create one.

    Queue

    Specify the Queue that the trigger listens to. If you do not have a suitable queue, see Queue Management to create one.

    Message Format Example

    An example of a RabbitMQ message body. In the triggered workflow, you can use ${workflow.triggerMessage} to get the full message body, or ${workflow.triggerMessage.xxx} to get the value of a specific field in the message body.

    OSS

    Parameter

    Configuration description

    Workspace

    Select the workspace where the trigger can be used. Only workspaces that support the new Data Studio are available.

    Applicable Environment

    Triggers take effect only in the production environment. If you run the workflow in the development environment, you must manually enter the parameter values.

    Owner

    Select an owner for the trigger from the drop-down list.

    Trigger Type

    Select Object Storage Service (OSS).

    Trigger Event

    Three event types are supported. Select the event type to monitor.

    Bucket Name

    From the drop-down list, select the name of the OSS bucket to use as the event source. If you have not created a bucket, you can create an OSS bucket.

    File Name

    Specify the name of the file that triggers the event. Wildcard characters are supported (a minimal upload sketch follows this procedure):

    • File prefix match:

      • Example: task*.

      • Description: Uploading a file with the task prefix to OSS, such as task10.txt, triggers the event.

    • File suffix match:

      • Example: *task.txt.

      • Description: Uploading a file with the task.txt suffix to OSS, such as work_task.txt, triggers the event.

    • Flexible match:

      • Example: *task*.

      • Description: Uploading a file whose name contains task, such as work_task.txt, to OSS triggers the event.

  3. Click OK to create the trigger.
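
After you create an OSS trigger, you can verify the file name rule by uploading a matching object. The following sketch uses the oss2 Python SDK and the task* prefix example from the table above. It is a minimal sketch only: the endpoint, bucket name, and credentials are placeholders, and in practice you should load credentials from a secure source rather than hard-coding them.

  # Minimal sketch: upload a file whose name matches the "task*" prefix rule,
  # which fires an OSS trigger configured on this bucket.
  # The endpoint, bucket name, and credentials are placeholders.
  import oss2

  auth = oss2.Auth("<access_key_id>", "<access_key_secret>")
  bucket = oss2.Bucket(auth, "https://oss-cn-hangzhou.aliyuncs.com", "your-bucket")

  # "task10.txt" matches the prefix pattern task*; the resulting object-creation
  # event starts the associated event-triggered workflow.
  bucket.put_object("task10.txt", b"hello from the upload sketch")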

Use a trigger

A trigger must be used together with an event-triggered workflow. Only event-triggered workflows that are published to Operation Center can be started by a trigger.

To implement event-driven scheduling, associate a trigger with an event-triggered workflow. After you submit the workflow to Operation Center, the tasks in the workflow are automatically triggered when a monitored event occurs.

  1. Create an event-triggered workflow.

  2. When you create the event-triggered workflow, configure the trigger that you created in the Scheduling Policy section.

  3. The inner nodes of an event-triggered workflow are configured in the same way as those of a normal workflow.

    The core difference is the execution mechanism. An event-triggered workflow does not rely on a recurring schedule. Instead, it is driven by a trigger that responds to external events.

Manage triggers

On the Trigger Management tab, you can find a trigger to view its referenced tasks, modify it, view its versions, or perform a rollback.

  • View Referenced Tasks: If a trigger is referenced by an event-triggered workflow, you can click View Referenced Tasks in the Actions column. On the View Referenced Tasks page, you can view the event-triggered workflows that reference the trigger.

  • Modify a trigger: Click Modify in the Actions column. On the Modify Trigger page, edit the trigger information and click Confirm.

    Note

    After you modify a trigger, the system automatically creates a new version.

  • View Versions:

    1. Click Versions in the Actions column. On the View Versions page, you can view all historical versions of the trigger.

    2. You can click View in the Actions column to see the details of a specific version.

    3. Rollbacks are supported. To roll back to a previous version, click Roll Back next to that version. In the dialog box that appears, enter a comment in the Remark field and click OK.

      Note

      When you perform a rollback, the system automatically creates a new version based on the historical version that you selected.

  • Delete a trigger: Before you can delete a trigger, you must ensure that all tasks that reference it are unpublished and deleted. Then, click Delete and click Confirm in the confirmation dialog box.

More operations

After you create and configure a trigger, you can use it in an event-triggered workflow. For more information, see Event-triggered workflows.