CloudFlow: EventBridge event scheduling

Last Updated: Sep 14, 2024

EventBridge event scheduling includes event scheduling of cloud services and event scheduling with custom event sources. Event scheduling of cloud services supports event sources from almost all Alibaba Cloud services in the areas of elastic computing, storage, databases, containers, big data processing, observability, and middleware. Event scheduling with custom event sources supports event sources in Simple Log Service, ApsaraMQ for Kafka, ApsaraMQ for RocketMQ, and ApsaraMQ for RabbitMQ. This topic describes how to create EventBridge event scheduling and the advanced features of workflow scheduling.

Create EventBridge event scheduling

Event scheduling of cloud services

Event scheduling of cloud services uses Alibaba Cloud service events to implement workflow scheduling. The events include CloudMonitor events, audit events, Elastic Compute Service (ECS) events, Alibaba Cloud IoT events, and O&M events of specific Alibaba Cloud services. In the following example, ECS is used to describe how to create event scheduling of Alibaba Cloud services in the CloudFlow console.

Feature description

After you submit a request to create EventBridge workflow scheduling in the CloudFlow console, an event rule named rule-created-by-fnf-<random string> is automatically created in the default event bus for Alibaba Cloud services based on the configurations of the workflow scheduling. After the workflow scheduling is created, you can view its information on the workflow details page of the CloudFlow console. You can also view the automatically created event rule in the EventBridge console. When an event of the type that is specified in the event source is delivered to the event bus, the workflow that is associated with the workflow scheduling is triggered once.

Precautions

You can create up to 10 event rules in the default event bus, which is designed for cloud services.

Prerequisites

Create event scheduling of cloud services

  1. Log on to the CloudFlow console. In the top navigation bar, select a region.

  2. In the left-side navigation pane, click Workflows. On the Workflows page, click the workflow that you want to manage.

  3. On the details page of the workflow, click the Workflow Scheduling tab and click Create Workflow Scheduling.

  4. In the Create Workflow Scheduling panel, set the Scheduling Type parameter to Elastic Compute Service (ECS), configure the parameters, and then click OK. The following table describes the basic parameters.

    Parameter

    Description

    Example

    Name

    Enter a custom name for the workflow scheduling.

    ecs-schedule

    Event Type

    Select Custom or All. If you select Custom, you can select one or more event types of ECS.

    Custom and ecs:Disk:ConvertToPostpaidCompleted

    Event Mode Content

    After you configure the Event Type parameter, the event mode content is automatically populated. You cannot modify the content. For more information about event modes, see Event patterns.

    {
      "source": [
        "acs.ecs"
      ],
      "type": [
        "ecs:Disk:ConvertToPostpaidCompleted"
      ]
    }

    For information about advanced parameters, such as the Push Settings, Retry Policy, and Dead-letter Queue parameters, see Advanced features of workflow scheduling. After you create the workflow scheduling, you can follow the on-screen instructions to view, edit, delete, enable, and disable the workflow scheduling on the Workflow Scheduling tab of the workflow details page.
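
The event mode content shown above matches on exact field values: each field lists the values it accepts, and an event matches when every field's value appears in the pattern. The following Python sketch illustrates this matching logic; it is a simplification of EventBridge's full pattern syntax, not the service implementation, and the second event type below is made up for illustration.

```python
import json

def matches(pattern: dict, event: dict) -> bool:
    """Return True if every pattern field lists the event's value.

    Simplified: real EventBridge patterns also support advanced
    rules such as prefix and suffix matching.
    """
    return all(event.get(field) in allowed for field, allowed in pattern.items())

# The event mode content from the table above.
pattern = json.loads("""
{
  "source": ["acs.ecs"],
  "type": ["ecs:Disk:ConvertToPostpaidCompleted"]
}
""")

event = {"source": "acs.ecs", "type": "ecs:Disk:ConvertToPostpaidCompleted"}
other = {"source": "acs.ecs", "type": "ecs:Instance:StateChange"}  # illustrative type
```

Here, `matches(pattern, event)` is true, while `matches(pattern, other)` is false because the second event's type is not listed in the pattern.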

HTTP workflow scheduling

After HTTP requests that serve as event sources are integrated with CloudFlow by using EventBridge, HTTP workflow scheduling can trigger the execution of associated workflows. This section describes how to create HTTP workflow scheduling in the CloudFlow console.

Precautions

If the number of existing custom buses and event rules reaches the upper limit, you can no longer create HTTP workflow scheduling of the event mode. For information about the limits on the number of resources that are used to create workflow scheduling in a single Alibaba Cloud account and a single region, see Limits.

Prerequisites

Create HTTP workflow scheduling

  1. Log on to the CloudFlow console. In the top navigation bar, select a region.

  2. In the left-side navigation pane, click Workflows. On the Workflows page, click the workflow that you want to manage.

  3. On the details page of the workflow, click the Workflow Scheduling tab and click Create Workflow Scheduling.

  4. In the Create Workflow Scheduling panel, set the Scheduling Type parameter to Triggered by HTTP/HTTPS Requests, configure the parameters, and then click OK. The following table describes the basic parameters.

    Parameter

    Description

    Example

    Name

    Enter a custom name for the workflow scheduling.

    https-schedule

    Request Type

    Select HTTPS, HTTP, or HTTP&HTTPS.

    HTTP

    Request Method

    Select one or more supported HTTP request methods. Valid values:

    • GET

    • POST

    • PUT

    • DELETE

    • HEAD

    • PATCH

    GET

    Security Settings

    Select a security setting type. Valid values:

    • N/A: No security settings are in effect. All received URL requests can trigger the execution of the workflow.

    • CIDR Block: Enter one or more IP addresses or CIDR blocks from which requests are allowed. Only URL requests that use the specified IP addresses or the IP addresses within the specified CIDR blocks can trigger the execution of the workflow. You can add up to five IP addresses or CIDR blocks.

    • Secure Domain Name: Enter one or more secure domain names. Only URL requests that use the secure domain names can trigger the execution of the workflow. You can enter up to five secure domain names.

    CIDR Block: 10.45.12.0/24

    For information about advanced parameters, such as the Push Settings, Retry Policy, and Dead-letter Queue parameters, see Advanced features of workflow scheduling. After you create the workflow scheduling, you can follow the on-screen instructions to view, edit, delete, enable, and disable the workflow scheduling on the Workflow Scheduling tab of the workflow details page.
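
When you select CIDR Block as the security setting, the check amounts to an IP allow-list on each incoming request. The following sketch uses Python's standard ipaddress module to show the idea; the allow-list entry is the example from the table above, not a default of the service.

```python
import ipaddress

# Example allow-list; up to five IP addresses or CIDR blocks can be configured.
ALLOWED = [ipaddress.ip_network("10.45.12.0/24")]

def is_allowed(source_ip: str) -> bool:
    """True if the request's source IP falls inside any allowed block."""
    addr = ipaddress.ip_address(source_ip)
    return any(addr in block for block in ALLOWED)
```

A request from 10.45.12.7 would pass this check, while one from 192.0.2.1 would not.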

Simple Message Queue (formerly MNS) workflow scheduling

Precautions

  • The Simple Message Queue (formerly MNS) queue that serves as the event source must reside in the same region as the workflow for which you create the workflow scheduling.

  • If the number of existing event streams reaches the upper limit, you can no longer create Simple Message Queue (formerly MNS) scheduling of the event mode. For information about the limits on the number of resources that are used to create workflow scheduling in a single Alibaba Cloud account and a single region, see Limits.

Prerequisites

Create workflow scheduling

  1. Log on to the CloudFlow console. In the top navigation bar, select a region.

  2. In the left-side navigation pane, click Workflows. On the Workflows page, click the workflow that you want to manage.

  3. On the details page of the workflow, click the Workflow Scheduling tab and click Create Workflow Scheduling.

  4. In the Create Workflow Scheduling panel, set the Scheduling Type parameter to Message Service (MNS), configure the parameters, and then click OK. The following table describes the basic parameters.

    Parameter

    Description

    Example

    Name

    Enter a custom name for the workflow scheduling.

    mns-schedule

    Queue Name

    Select a Simple Message Queue (formerly MNS) queue.

    MyQueue

    Base64 Decoding

    If you want to decode Simple Message Queue (formerly MNS) data before you deliver it, select Enable Base64 Decoding.

    Enable Base64 Decoding

    For information about advanced parameters, such as the Push Settings, Retry Policy, and Dead-letter Queue parameters, see Advanced features of workflow scheduling. After you create the workflow scheduling, you can follow the on-screen instructions to view, edit, delete, enable, and disable the workflow scheduling on the Workflow Scheduling tab of the workflow details page.
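
If you select Enable Base64 Decoding, the message body is Base64-decoded before it is delivered to the workflow. The decoding step itself looks like the following Python sketch; this illustrates the transformation, not CloudFlow's internal code.

```python
import base64

def decode_body(raw: str, base64_decoding: bool) -> str:
    """Decode a queue message body if Base64 decoding is enabled."""
    if base64_decoding:
        return base64.b64decode(raw).decode("utf-8")
    return raw
```

For example, with decoding enabled, the raw body "aGVsbG8=" is delivered as "hello"; with decoding disabled, it is delivered unchanged.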

Kafka workflow scheduling

Precautions

  • The ApsaraMQ for Kafka instance that serves as the event source must reside in the same region as the workflow for which you create the workflow scheduling.

  • If the number of existing event streams reaches the upper limit, you can no longer create Kafka workflow scheduling. For information about the limits on the number of resources that are used to create workflow scheduling in a single Alibaba Cloud account and a single region, see Limits.

Prerequisites

Create Kafka workflow scheduling

  1. Log on to the CloudFlow console. In the top navigation bar, select a region.

  2. In the left-side navigation pane, click Workflows. On the Workflows page, click the workflow that you want to manage.

  3. On the details page of the workflow, click the Workflow Scheduling tab and click Create Workflow Scheduling.

  4. In the Create Workflow Scheduling panel, set the Scheduling Type parameter to ApsaraMQ for Kafka, configure the parameters, and then click OK. The following table describes the basic parameters.

    Parameter

    Description

    Example

    Name

    Enter a custom name for the workflow scheduling.

    kafka-schedule

    Kafka Instance

    Select an ApsaraMQ for Kafka instance.

    alikafka_pre-cn-i7m2t7t1****

    Topic

    Select a topic of the ApsaraMQ for Kafka instance.

    topic1

    Group ID

    Select a group ID of the ApsaraMQ for Kafka instance.

    Important

    You must use a separate group ID to create the workflow scheduling. Do not reuse an existing group ID of another messaging service. Otherwise, message sending and receiving on that service may fail.

    GID_group1

    Concurrent Consumption Tasks

    Specify the number of concurrent consumers. Valid values: 1 to <number of partitions in the topic>.

    2

    Consumer Offset

    Select the offset from which EventBridge starts to pull messages from ApsaraMQ for Kafka. Valid values:

    • Earliest Offset: pulls messages from the earliest offset.

    • Latest Offset: pulls messages from the latest offset.

    Latest Offset

    Network Settings

    Select the type of the network over which you want to route messages. Valid values:

    • Default Network: automatically uses the VPC ID and vSwitch ID that are specified when the ApsaraMQ for Kafka instance is deployed.

    • Internet: You must specify the Virtual Private Cloud (VPC), vSwitch, and Security Group parameters.

    Default Network

    For information about advanced parameters, such as the Push Settings, Retry Policy, and Dead-letter Queue parameters, see Advanced features of workflow scheduling. After you create the workflow scheduling, you can follow the on-screen instructions to view, edit, delete, enable, and disable the workflow scheduling on the Workflow Scheduling tab of the workflow details page.
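
The valid range for Concurrent Consumption Tasks follows from how Kafka consumer groups work: each partition is consumed by at most one consumer in a group, so consumers beyond the partition count would sit idle. The following illustrative helper (not an SDK call) clamps a requested value to the valid range:

```python
def clamp_concurrency(requested: int, partitions: int) -> int:
    """Clamp the concurrent-consumer count to 1..<partition count>."""
    if partitions < 1:
        raise ValueError("a topic must have at least one partition")
    return max(1, min(requested, partitions))
```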

RocketMQ workflow scheduling

Precautions

  • The ApsaraMQ for RocketMQ instance that serves as the event source must reside in the same region as the workflow for which you create the workflow scheduling.

  • If the number of existing event streams reaches the upper limit, you can no longer create RocketMQ workflow scheduling of the event mode. For information about the limits on the number of resources that are used to create workflow scheduling in a single Alibaba Cloud account and a single region, see Limits.

Prerequisites

Create RocketMQ workflow scheduling

  1. Log on to the CloudFlow console. In the top navigation bar, select a region.

  2. In the left-side navigation pane, click Workflows. On the Workflows page, click the workflow that you want to manage.

  3. On the details page of the workflow, click the Workflow Scheduling tab and click Create Workflow Scheduling.

  4. In the Create Workflow Scheduling panel, set the Scheduling Type parameter to ApsaraMQ for RocketMQ, configure the parameters, and then click OK. The following table describes the basic parameters.

    Parameter

    Description

    Example

    Name

    Enter a custom name for the workflow scheduling.

    rocketmq-schedule

    ApsaraMQ for RocketMQ Instance

    Select an ApsaraMQ for RocketMQ instance.

    MQ_INST_164901546557****_BX7****

    Topic

    Select a topic of the ApsaraMQ for RocketMQ instance.

    topic1

    Tag

    Specify the tag that is used to filter messages. CloudFlow triggers the execution of the workflow only when a received message contains the specified tag.

    tag

    Group ID

    Select a group ID of the ApsaraMQ for RocketMQ instance. We recommend that you select Create. Then, the system automatically creates a group ID named GID_FNF_TRIGGER_{uuid}_{timestamp}.

    Important

    You must use a separate group ID to create the workflow scheduling. Do not reuse an existing group ID of another messaging service. Otherwise, message sending and receiving on that service may fail.

    GID_group1

    Consumer Offset

    Select the offset from which EventBridge starts to pull messages from ApsaraMQ for RocketMQ. Valid values:

    • Latest Offset: pulls messages from the latest offset.

    • Earliest Offset: pulls messages from the earliest offset.

    • Timestamp: pulls messages from the specified timestamp.

    Latest Offset

    For information about advanced parameters, such as the Push Settings, Retry Policy, and Dead-letter Queue parameters, see Advanced features of workflow scheduling. After you create the workflow scheduling, you can follow the on-screen instructions to view, edit, delete, enable, and disable the workflow scheduling on the Workflow Scheduling tab of the workflow details page.
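
The Tag parameter is a plain string filter: a received message triggers the workflow only if its tag matches the configured value. A minimal sketch of that filtering step (the function name is illustrative):

```python
from typing import Optional

def should_trigger(message_tag: str, filter_tag: Optional[str]) -> bool:
    """A message passes when no filter tag is set or its tag matches."""
    return filter_tag is None or message_tag == filter_tag
```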

RabbitMQ workflow scheduling

Precautions

  • The ApsaraMQ for RabbitMQ instance that is used as the event source must reside in the same region as the workflow for which you create the workflow scheduling.

  • If the number of existing event streams reaches the upper limit, you can no longer create RabbitMQ workflow scheduling of the event mode. For information about the limits on the number of resources that are used to create workflow scheduling in a single Alibaba Cloud account and a single region, see Limits.

Prerequisites

Create RabbitMQ workflow scheduling

  1. Log on to the CloudFlow console. In the top navigation bar, select a region.

  2. In the left-side navigation pane, click Workflows. On the Workflows page, click the workflow that you want to manage.

  3. On the details page of the workflow, click the Workflow Scheduling tab and click Create Workflow Scheduling.

  4. In the Create Workflow Scheduling panel, set the Scheduling Type parameter to ApsaraMQ for RabbitMQ, configure the parameters, and then click OK. The following table describes the basic parameters.

    Parameter

    Description

    Example

    Name

    Enter a custom name for the workflow scheduling.

    rabbitmq-schedule

    ApsaraMQ for RabbitMQ Instance

    Select an ApsaraMQ for RabbitMQ instance.

    amqp-cn-i7m2l6m2****

    Vhost

    Select a vhost on the ApsaraMQ for RabbitMQ instance.

    myhost-1

    Queue

    Select a queue of the ApsaraMQ for RabbitMQ instance.

    myqueue-1

    For information about advanced parameters, such as the Push Settings, Retry Policy, and Dead-letter Queue parameters, see Advanced features of workflow scheduling. After you create the workflow scheduling, you can follow the on-screen instructions to view, edit, delete, enable, and disable the workflow scheduling on the Workflow Scheduling tab of the workflow details page.

Simple Log Service workflow scheduling

Simple Log Service workflow scheduling connects Simple Log Service to CloudFlow. When a new log is generated, the execution of the workflow is triggered to process the log. This section describes how to create Simple Log Service workflow scheduling in the CloudFlow console.

Precautions

  • The Simple Log Service project that is used as the event source must reside in the same region as the workflow for which you create the workflow scheduling.

  • If the number of existing event streams reaches the upper limit, you can no longer create Simple Log Service workflow scheduling of the event mode. For information about the limits on the number of resources that are used to create workflow scheduling in a single Alibaba Cloud account and a single region, see Limits.

Prerequisites

Create Simple Log Service workflow scheduling

  1. Log on to the CloudFlow console. In the top navigation bar, select a region.

  2. In the left-side navigation pane, click Workflows. On the Workflows page, click the workflow that you want to manage.

  3. On the details page of the workflow, click the Workflow Scheduling tab and click Create Workflow Scheduling.

  4. In the Create Workflow Scheduling panel, set the Scheduling Type parameter to Log Service, configure the parameters, and then click OK. The following table describes the basic parameters.

    Parameter

    Description

    Example

    Name

    Enter a custom name for the workflow scheduling.

    sls-schedule

    Project

    Select an existing Simple Log Service project.

    test-Project

    Logstore

    Select an existing Simple Log Service Logstore.

    test-LogStore

    Starting Consumer Offset

    The offset from which CloudFlow pulls messages. Valid values: Earliest Offset, Latest Offset, and Timestamp.

    Latest Offset

    Configurations of Log Service Role

    Select a Simple Log Service role. EventBridge assumes the role to read Simple Log Service logs.

    testRole

    For information about advanced parameters, such as the Push Settings, Retry Policy, and Dead-letter Queue parameters, see Advanced features of workflow scheduling. After you create the workflow scheduling, you can follow the on-screen instructions to view, edit, delete, enable, and disable the workflow scheduling on the Workflow Scheduling tab of the workflow details page.

MQTT workflow scheduling

After ApsaraMQ for MQTT instances that serve as event sources are integrated with CloudFlow by using EventBridge, ApsaraMQ for MQTT workflow scheduling can trigger the execution of associated workflows. Then, CloudFlow processes messages that are published to ApsaraMQ for MQTT based on your custom configurations. This section describes how to create MQTT workflow scheduling in the CloudFlow console.

Precautions

  • The ApsaraMQ for MQTT instance that is used as the event source must reside in the same region as the workflow for which you create the workflow scheduling.

  • If the number of existing event streams reaches the upper limit, you can no longer create MQTT workflow scheduling of the event mode. For information about the limits on the number of resources that are used to create workflow scheduling in a single Alibaba Cloud account and a single region, see Limits.

Prerequisites

Create MQTT workflow scheduling

  1. Log on to the CloudFlow console. In the top navigation bar, select a region.

  2. In the left-side navigation pane, click Workflows. On the Workflows page, click the workflow that you want to manage.

  3. On the details page of the workflow, click the Workflow Scheduling tab and click Create Workflow Scheduling.

  4. In the Create Workflow Scheduling panel, set the Scheduling Type parameter to ApsaraMQ for MQTT, configure the parameters, and then click OK. The following table describes the basic parameters.

    Parameter

    Description

    Example

    Name

    Enter a custom name for the workflow scheduling.

    mqtt-schedule

    ApsaraMQ for MQTT Instance

    Select an ApsaraMQ for MQTT instance.

    mqtt-xxx

    MQTT Topic

    Select a topic of the ApsaraMQ for MQTT instance.

    testTopic

    For information about advanced parameters, such as the Push Settings, Retry Policy, and Dead-letter Queue parameters, see Advanced features of workflow scheduling. After you create the workflow scheduling, you can follow the on-screen instructions to view, edit, delete, enable, and disable the workflow scheduling on the Workflow Scheduling tab of the workflow details page.

DTS workflow scheduling

After Data Transmission Service (DTS) instances that serve as event sources are integrated with CloudFlow by using EventBridge, DTS workflow scheduling can trigger the execution of associated workflows. This topic describes how to create DTS workflow scheduling in the CloudFlow console.

Precautions

  • The DTS change tracking task that is used as the event source must reside in the same region as the workflow for which you create the workflow scheduling.

  • If the number of existing event streams reaches the upper limit, you can no longer create DTS workflow scheduling of the event mode. For information about the limits on the number of resources that are used to create workflow scheduling in a single Alibaba Cloud account and a single region, see Limits.

Prerequisites

Create DTS workflow scheduling

  1. Log on to the CloudFlow console. In the top navigation bar, select a region.

  2. In the left-side navigation pane, click Workflows. On the Workflows page, click the workflow that you want to manage.

  3. On the details page of the workflow, click the Workflow Scheduling tab and click Create Workflow Scheduling.

  4. In the Create Workflow Scheduling panel, set the Scheduling Type parameter to Data Transmission Service (DTS), configure the parameters, and then click OK. The following table describes the basic parameters.

    Parameter

    Description

    Example

    Name

    Enter a custom name for the workflow scheduling.

    dts-schedule

    Change Tracking Task

    Select a change tracking task.

    dtsqntc2***

    Consumer Group

    Select a consumer group that is used to consume the change tracking task.

    test

    Account

    Enter the account name that you specified when you created the consumer group.

    test

    Password

    Enter the account password that you specified when you created the consumer group.

    *******

    Consumer Offset

    Specify the timestamp from which the first data record is pulled. The consumer offset must be within the data range of the DTS instance.

    2022-06-21 00:00:00

    For information about advanced parameters, such as the Push Settings, Retry Policy, and Dead-letter Queue parameters, see Advanced features of workflow scheduling. After you create the workflow scheduling, you can follow the on-screen instructions to view, edit, delete, enable, and disable the workflow scheduling on the Workflow Scheduling tab of the workflow details page.

Advanced features of workflow scheduling

Push formats

A push format specifies the format of each data element in the Event parameter.

  • CloudEvents: a specification for describing event data in a common format. CloudEvents simplifies event declaration and transmission between different services and platforms.

  • RawData: Only the content of the data field in the CloudEvents envelope is delivered. Other CloudEvents metadata is not delivered.
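
The difference between the two formats can be shown with a minimal CloudEvents-style envelope: CloudEvents delivers the whole envelope, while RawData keeps only the data field. The field names below follow the CloudEvents specification; the payload values are made up for illustration.

```python
# A CloudEvents-style envelope: metadata plus a "data" payload.
cloud_event = {
    "specversion": "1.0",
    "id": "abc-123",
    "source": "acs.mns",
    "type": "mns:Queue:SendMessage",
    "data": {"body": "hello"},
}

def to_push_payload(event: dict, fmt: str) -> dict:
    """CloudEvents delivers the full envelope; RawData only its data field."""
    return event["data"] if fmt == "RawData" else event
```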

Batch push

After you enable batch push, you must specify the number of batch push messages and a batch push interval.

  • Number of batch push messages: the maximum number of messages that can be sent in a single push. A push is triggered as soon as the number of accumulated messages reaches the specified value. Valid values: 1 to 10000.

  • Batch push interval: the interval at which messages are pushed. The system aggregates messages and sends them to CloudFlow at the specified interval. Valid values: 0 to 15. Unit: seconds. A value of 0 indicates that messages are sent in real time.

Batch push examples:

  • Example 1:

    You set the number of batch push messages to 10, the size of each message to 1 KB, and the batch push interval to 15s. The number of messages reaches 10 within 10s. In this case, the messages are immediately sent without waiting for 15 seconds.

  • Example 2:

    You set the number of batch push messages to 32, the size of each message to 1 KB, and the batch push interval to 15s. Only 10 messages arrive within 15s. In this case, the 10 messages are sent when the interval elapses, without waiting for 32 messages to be aggregated.

  • Example 3:

    You set the number of batch push messages to 20, the size of each message to 2 KB, and the batch push interval to 15s. The number of messages reaches 40 within 10s. The total size of the messages is 80 KB (40 × 2 KB), which exceeds the 64 KB input size limit of workflows. In this case, the messages are sent immediately: the first 32 messages are pushed to CloudFlow in the first batch, and the remaining 8 messages are pushed in the second batch.
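
The three examples above reduce to three flush triggers: the batch fills up, the interval elapses, or the accumulated size reaches the 64 KB per-push limit. The following Python sketch models that aggregation logic under those assumptions; it is illustrative, not service code.

```python
import time
from typing import List, Optional

class Batcher:
    """Buffer messages and flush on count, interval, or size."""

    MAX_PUSH_BYTES = 64 * 1024  # input size limit of workflows

    def __init__(self, max_messages: int, interval_s: float):
        self.max_messages = max_messages
        self.interval_s = interval_s
        self.buffer: List[bytes] = []
        self.last_flush = time.monotonic()

    def _should_flush(self) -> bool:
        full = len(self.buffer) >= self.max_messages
        timed_out = time.monotonic() - self.last_flush >= self.interval_s
        oversized = sum(len(m) for m in self.buffer) >= self.MAX_PUSH_BYTES
        return full or timed_out or oversized

    def add(self, message: bytes) -> Optional[List[bytes]]:
        """Buffer a message; return a batch when a flush is triggered."""
        self.buffer.append(message)
        if self._should_flush():
            batch, self.buffer = self.buffer, []
            self.last_flush = time.monotonic()
            return batch
        return None
```

With max_messages set to 10 and interval_s to 15, ten 1 KB messages that arrive quickly are flushed as soon as the tenth arrives, as in Example 1.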

Retry policy

If a message fails to be pushed, the push is retried based on the retry policy that you configure. The following retry policies are available:

  • Backoff retry: The system retries the request up to three times. The interval between two consecutive retries is random and ranges from 10 to 20 seconds.

  • Exponential decay retry: the default retry policy. The system retries the request up to 176 times within 24 hours. The retry interval increases exponentially to a maximum of 512 seconds: the intervals are 1s, 2s, 4s, 8s, 16s, 32s, 64s, 128s, 256s, and 512s, and the 512s interval is used 167 times.
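
The exponential decay schedule can be reconstructed as follows, assuming the interval doubles up to the 512-second cap and the cap is then repeated while the total stays within the 24-hour window. Under that assumption, the schedule reproduces both figures stated above: 176 retries in total, with the 512s interval occurring 167 times.

```python
from typing import List

def retry_schedule(window_s: int = 24 * 3600, cap_s: int = 512) -> List[int]:
    """Doubling intervals 1s..512s, then the cap repeated within the window."""
    intervals, t = [], 1
    while t < cap_s:
        intervals.append(t)
        t *= 2
    intervals.append(cap_s)
    while sum(intervals) + cap_s <= window_s:
        intervals.append(cap_s)
    return intervals
```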

Fault tolerance policy

A fault tolerance policy specifies whether to tolerate an error when it occurs.

  • Fault tolerance allowed

    The system skips the requests that failed to be executed after retries and proceeds to the next request.

  • Fault tolerance prohibited

    Consumption is blocked when a request fails to be executed after retries.
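
The two policies differ only in what happens after retries are exhausted: skip the failed request, or stop consuming. A compact sketch of the distinction (the handler is a stand-in for workflow execution):

```python
def consume(messages, handler, fault_tolerance_allowed: bool):
    """Process messages in order; skip or block on a final failure."""
    processed, skipped = [], []
    for msg in messages:
        try:
            handler(msg)
            processed.append(msg)
        except Exception:
            if fault_tolerance_allowed:
                skipped.append(msg)  # proceed to the next request
            else:
                raise  # consumption is blocked
    return processed, skipped
```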

Dead-letter queues

You can configure a dead-letter queue only when the fault tolerance allowed policy is selected.

  • If you enable the dead-letter queue feature, messages that fail to be processed or that exceed the maximum number of retries are delivered to a destination service. The following services can serve as destinations in CloudFlow: Simple Message Queue (formerly MNS), ApsaraMQ for RocketMQ, ApsaraMQ for Kafka, and EventBridge. You can select a queue type based on your business requirements.

  • If you do not enable the dead-letter queue feature, messages that exceed the maximum number of retries are discarded.