DataWorks triggers start data pipelines automatically when external events occur, such as new files uploaded to Object Storage Service (OSS) or new messages arriving in a message queue. Instead of polling on a fixed schedule, triggers listen for events in real time through EventBridge and pass event data directly to the triggered workflow.
How it works
A trigger monitors a specific event source, such as an OSS bucket or a Kafka topic. When a matching event occurs:
1. The event source (OSS, Kafka, RocketMQ, or RabbitMQ) emits an event.
2. EventBridge routes the event to the DataWorks trigger.
3. The trigger starts the associated event-triggered workflow.
Event data is passed to the workflow as parameters. Inner nodes access this data through ${workflow.triggerMessage} for the full message body, or ${workflow.triggerMessage.xxx} for a specific field.
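The parameter passing above can be sketched in a Python inner node. This is a minimal illustration only: it assumes the trigger message arrives as a JSON string through a node parameter assigned ${workflow.triggerMessage}, and the field names are hypothetical, not a documented schema.

```python
import json

# Hypothetical trigger message a node parameter might receive when it is
# assigned ${workflow.triggerMessage}. The actual payload depends on the
# event source (OSS file details, or the full queue message).
trigger_message = '{"key": "order-123", "value": "created", "topic": "orders"}'

data = json.loads(trigger_message)

# ${workflow.triggerMessage.key} would resolve to the same "key" field:
print(data["key"])    # order-123
print(data["topic"])  # orders
```

In practice you would pass the parameter value into the node and parse it once, then hand individual fields to downstream logic.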
Supported event sources
| Event source | Event type code | Triggered by |
|---|---|---|
| ApsaraMQ for Kafka | alikafka:Topic:Message | A message published to a monitored Kafka topic |
| ApsaraMQ for RocketMQ | mq:Topic:SendMessage | A message consumed from a monitored RocketMQ topic |
| ApsaraMQ for RabbitMQ | amqp:Queue:SendMessage | A message consumed from a monitored RabbitMQ queue |
| OSS | oss:ObjectCreated:PutObject, oss:ObjectCreated:PostObject, oss:ObjectCreated:CompleteMultipartUpload | A file uploaded to a monitored bucket through simple upload, form upload, or multipart upload |
Event data captured
| Event source | Captured data | Access method |
|---|---|---|
| OSS | File details | ${workflow.triggerMessage.xxx} |
| Message queues (Kafka, RocketMQ, RabbitMQ) | Full message content (key, value, headers) | ${workflow.triggerMessage} for the full body, ${workflow.triggerMessage.xxx} for a specific field |
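To make the table concrete, the following sketch shows how a field reference such as ${workflow.triggerMessage.xxx} maps onto an OSS-style payload. The field names here (bucket, object, size) are illustrative assumptions, not the documented event schema delivered by EventBridge.

```python
import json

# Illustrative OSS event payload; the real field names delivered through
# EventBridge may differ from this sketch.
trigger_message = json.dumps({
    "bucket": "my-demo-bucket",
    "object": "uploads/task10.txt",
    "size": 2048,
    "eventType": "oss:ObjectCreated:PutObject",
})

payload = json.loads(trigger_message)

# A node referencing a field of ${workflow.triggerMessage} (here, the
# assumed "object" field) would receive the uploaded file's key, which
# downstream nodes can use to locate the file.
print(payload["object"])  # uploads/task10.txt
```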
Limitations
Regions: Available only in the China (Hangzhou), China (Beijing), China (Zhangjiakou), China (Ulanqab), China (Shenzhen), China (Hong Kong), and Singapore regions.
Editions: Requires DataWorks Professional Edition or later. To upgrade, see Differences among DataWorks editions.
Environment: Triggers run only in the production environment. In the development environment, enter trigger parameters manually.
Workspace: Triggers are available only in workspaces that support the new Data Studio.
Billing
Triggers rely on EventBridge to route events. In addition to DataWorks scheduled task run fees, EventBridge charges based on the number of events processed. For details, see Billing rules for event streams.
Create a trigger
Before you begin, make sure that you have:
- An event source (OSS, ApsaraMQ for Kafka, ApsaraMQ for RocketMQ, or ApsaraMQ for RabbitMQ) in the same region as the DataWorks workspace, with access permissions configured.
- The Developer, O&M, or Workspace Administrator role assigned to your account in the workspace. For details, see Add a workspace member and manage member role permissions.
Step 1: Go to the trigger management page
1. Log in to the DataWorks console. In the top navigation bar, select the target region.
2. In the left-side navigation pane, choose Data Development and O&M > Operation Center. Select the target workspace from the drop-down list and click Go to Operation Center.
3. In the left-side navigation pane of Operation Center, choose Other > Tenant Schedule Setting, and then click the Trigger Management tab.
Step 2: Create the trigger
1. On the Trigger Management tab, click Create Trigger.
   DataWorks triggers rely on EventBridge. If this is your first time using triggers, or the AliyunServiceRoleForDataWorksScheduler service-linked role is missing, click Add to grant the required permissions. The required permission is ram:CreateServiceLinkedRole. For details, see Minimum permissions required to create a service-linked role.
2. Configure trigger parameters based on the event source type.
ApsaraMQ for Kafka
| Parameter | Description |
|---|---|
| Workspace | The workspace where this trigger is available. Only workspaces that support the new Data Studio are listed. |
| Applicable Environment | Production only. In the development environment, enter parameters manually. |
| Owner | The trigger owner. Select from the drop-down list. |
| Trigger Type | Select ApsaraMQ for Kafka. |
| Trigger Event | alikafka:Topic:Message. Triggered when a Kafka message is published. For details on sending messages, see Send and receive Kafka messages. |
| Kafka Instance | A Kafka instance in the same region as the workspace. If none are available, purchase a Kafka instance. |
| Topic | The topic the trigger listens to. If none are available, create a topic. |
| Key | (Optional) An exact-match filter on the message key. Leave blank to trigger on any message. |
| ConsumerGroupId | Select Quick Create to auto-generate a consumer group, or Use Existing to select one. |
| Message Format Example | A fixed example of a Kafka message. In the triggered workflow, use ${workflow.triggerMessage} for the full message body, or ${workflow.triggerMessage.xxx} for a specific field. |
ApsaraMQ for RocketMQ
Versions earlier than 5.x are not supported. Version 5.x is used by default.
| Parameter | Description |
|---|---|
| Workspace | The workspace where this trigger is available. Only workspaces that support the new Data Studio are listed. |
| Applicable Environment | Production only. In the development environment, enter parameters manually. |
| Owner | The trigger owner. Select from the drop-down list. |
| Trigger Type | Select Message Queue For Apache RocketMQ. |
| Trigger Event | mq:Topic:SendMessage. Triggered when a RocketMQ message is consumed. |
| ApsaraMQ for RocketMQ Instance | A RocketMQ instance in the same region as the workspace. If none are available, create a RocketMQ instance. |
| Topic | The topic the trigger listens to. If none are available, create a topic. |
| Tag | (Optional) An exact-match filter on the message tag. Leave blank to trigger on any message. |
| Security Group | If the RocketMQ instance is a subscription instance, select a security group. |
| Consumer Group | Select Quick Create to auto-generate a consumer group, or Use Existing to select one. |
| Message Format Example | A fixed example of a RocketMQ message. In the triggered workflow, use ${workflow.triggerMessage} for the full message body, or ${workflow.triggerMessage.xxx} for a specific field. |
ApsaraMQ for RabbitMQ
| Parameter | Description |
|---|---|
| Workspace | The workspace where this trigger is available. Only workspaces that support the new Data Studio are listed. |
| Applicable Environment | Production only. In the development environment, enter parameters manually. |
| Owner | The trigger owner. Select from the drop-down list. |
| Trigger Type | Select ApsaraMQ for RabbitMQ. |
| Trigger Event | amqp:Queue:SendMessage. Triggered when a RabbitMQ message is consumed. |
| ApsaraMQ for RabbitMQ Instance | A RabbitMQ instance in the same region as the workspace. If none are available, create a RabbitMQ instance. |
| Vhost | The RabbitMQ virtual host (vhost), used for logical isolation of queues. If none exist, see Vhost management. |
| Queue | The queue the trigger listens to. If none exist, see Queue management. |
| Message Format Example | An example of a RabbitMQ message body. In the triggered workflow, use ${workflow.triggerMessage} for the full message body, or ${workflow.triggerMessage.xxx} for a specific field. |
OSS
| Parameter | Description |
|---|---|
| Workspace | The workspace where this trigger is available. Only workspaces that support the new Data Studio are listed. |
| Applicable Environment | Production only. In the development environment, enter parameters manually. |
| Owner | The trigger owner. Select from the drop-down list. |
| Trigger Type | Select Object Storage Service (OSS). |
| Trigger Event | Select one of the following event types: oss:ObjectCreated:PutObject (simple upload), oss:ObjectCreated:PostObject (form upload), or oss:ObjectCreated:CompleteMultipartUpload (multipart upload). |
| Bucket Name | The OSS bucket to monitor. Select from the drop-down list. If no buckets exist, create a bucket. |
| File Name | The file name pattern that triggers the event. Wildcards are supported. See the table below. |
File name matching patterns
| Pattern | Example | Matches |
|---|---|---|
| Prefix match | task* | Files with the task prefix, such as task10.txt |
| Suffix match | *task.txt | Files ending with task.txt, such as work_task.txt |
| Contains match | *task* | Files containing task in the name, such as work_task.txt |
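The matching behavior in the table can be checked with shell-style wildcards. The sketch below uses Python's fnmatch as a stand-in for the console's matcher; treating them as equivalent is an assumption about the console's exact semantics.

```python
from fnmatch import fnmatch

# The three wildcard patterns from the table above.
patterns = {"prefix": "task*", "suffix": "*task.txt", "contains": "*task*"}

for name in ["task10.txt", "work_task.txt", "notes.txt"]:
    hits = [label for label, pat in patterns.items() if fnmatch(name, pat)]
    print(name, "->", hits or "no match")
```

Note that a file can satisfy more than one pattern at once: task10.txt matches both the prefix and contains patterns, while work_task.txt matches the suffix and contains patterns.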
Step 3: Save the trigger
After configuring all parameters, click OK to create the trigger.
Associate a trigger with a workflow
A trigger must be paired with an event-triggered workflow. Only workflows published to Operation Center can be triggered by events.
In the workflow's Scheduling Policy section, select the trigger you created.
Configure the inner nodes of the workflow the same way as a standard workflow.
Unlike scheduled workflows that run on a fixed cron, event-triggered workflows only run when the associated trigger detects a matching event.
View referenced tasks
On the Trigger Management tab, click View Referenced Tasks in the Actions column to see all event-triggered workflows that reference a trigger.
Modify a trigger
On the Trigger Management tab, click Modify in the Actions column to edit the trigger configuration. After you save changes, a new version is created automatically.
View versions and roll back
On the Trigger Management tab, click Versions in the Actions column to view all historical versions of a trigger.
Click View next to a version to see its details.
To revert to a previous version, click Roll Back next to that version. Enter a comment in the Remark field and click OK.
Rolling back creates a new version based on the selected historical version. The previous versions remain intact.
Delete a trigger
Before deleting a trigger, unpublish and delete all tasks that reference it. On the Trigger Management tab, click Delete in the Actions column and confirm.
Next steps
After creating a trigger, associate it with an event-triggered workflow to build an event-driven data pipeline. For details, see Event-triggered workflows.