
EventBridge: Route messages between ApsaraMQ for Kafka instances

Last Updated: Mar 10, 2026

When you need to synchronize data across regions or between systems such as data warehouses, data processing programs, and data analytics platforms, you can route messages between ApsaraMQ for Kafka instances through an EventBridge event stream. This eliminates the need to deploy and maintain custom consumers or third-party replication tools. An event stream connects a source Kafka instance to a destination Kafka instance as a lightweight, real-time channel that supports filtering and transformation in transit.

How event streams route Kafka messages

An event stream operates independently of event buses and provides point-to-point message routing with minimal overhead:

  1. A source ApsaraMQ for Kafka instance produces messages to a topic.

  2. The event stream consumes messages from the source topic, optionally filters or transforms them, and forwards them to a topic on the destination ApsaraMQ for Kafka instance.

The following diagram shows the data flow and the event structure that passes through the stream:

Source Kafka instance          EventBridge event stream          Destination Kafka instance
 (topic: orders)  ──────>  [filter / transform] ──────>  (topic: orders-replica)

Each consumed message is wrapped in an event envelope. The $.data.value field contains the Base64-encoded message value, and $.data.key contains the message key. You reference these JSONPath expressions when you configure the sink to extract and decode message content.
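The envelope decoding described above can be sketched in Python. The envelope below is a hypothetical sample; only the $.data.value and $.data.key paths are taken from this topic, and the other fields are illustrative:

```python
import base64

# Hypothetical event envelope; only the $.data.value and $.data.key paths
# follow this topic, the remaining fields are illustrative.
envelope = {
    "data": {
        "topic": "orders",
        "key": "order-1001",
        "value": base64.b64encode(b'{"amount": 42}').decode("ascii"),
    }
}

# Resolve the two JSONPath expressions by hand: $.data.key and $.data.value.
message_key = envelope["data"]["key"]
raw_value = envelope["data"]["value"]

# The value is Base64-encoded, so decode it back to the original bytes.
message_value = base64.b64decode(raw_value)

print(message_key)                      # order-1001
print(message_value.decode("utf-8"))    # {"amount": 42}
```

This is why the sink configuration later in this topic uses Binary extraction for $.data.value: the sink performs the same Base64 decode before writing to the destination topic.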

Prerequisites

Before you begin, make sure that:

  1. A source ApsaraMQ for Kafka instance is deployed and contains the topic whose messages you want to route.

  2. A destination ApsaraMQ for Kafka instance is deployed and contains the topic that receives the routed messages.

  3. A consumer group dedicated to the event stream is created on the source instance.

Create an event stream

Important

Create the event stream in the region of the destination instance, not the source. For example, to route messages from China (Beijing) to China (Shanghai), create the event stream in the China (Shanghai) region.

  1. Log on to the EventBridge console.

  2. In the top navigation bar, select the region of the destination ApsaraMQ for Kafka instance.

  3. In the left-side navigation pane, click Event Streams.

  4. Click Create Event Stream.

  5. In the Create Event Stream panel, enter a Task Name and Description.

Configure the source

Set Data Provider to ApsaraMQ for Kafka in the Source configuration wizard, configure the following parameters, and click Next.

Parameter | Description | Example
Region | Region of the source ApsaraMQ for Kafka instance. | China (Beijing)
Kafka instance | Source instance that produces messages. | MQ_INST_115964845466****_ByBeUp3p
Topic | Topic on the source instance to consume messages from. | topic
Group ID | Consumer group on the source instance. Use a dedicated consumer group for the event stream; do not share it with existing services, to avoid interfering with message consumption. | GID_http_1
Concurrent quota (consumers) | Number of consumers allocated to the source instance. | 1
Consumer offset | Position from which to start consuming messages. | Latest offset
Network Configuration | Network type for message routing. | Default network
Virtual Private Cloud | VPC ID. Required when Network Configuration is set to Public network. | vpc-bp17fapfdj0dwzjkd****
vSwitch | vSwitch ID. Required when Network Configuration is set to Public network. | vsw-bp1gbjhj53hdjdkg****
Security group | Security group. Required when Network Configuration is set to Public network. | alikafka_pre-cn-7mz2****
Data format | Encoding format for the data source. For message routing between Kafka instances, set this to Binary. | Binary

Enable batch push (optional)

Enable Batch Push to aggregate multiple events into a single push. A push is triggered as soon as either of the following thresholds is reached:

Parameter | Description | Example
Messages | Maximum number of messages per batch. Valid values: 1 to 10000. | 100
Interval (Unit: Seconds) | Maximum wait time before a push is triggered. Valid values: 0 to 15. Set to 0 for immediate delivery. | 3

For example, if you set Messages to 100 and Interval to 15 seconds, and the message count reaches 100 within 10 seconds, the push triggers immediately without waiting for the full 15 seconds.
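The trigger behavior in that example can be sketched as a simple buffer that flushes on whichever threshold is reached first. This is an illustrative model of the rule, not EventBridge's implementation:

```python
class BatchPusher:
    """Illustrative model of the batch-push trigger: flush when either the
    message-count threshold or the time interval is reached, whichever
    comes first."""

    def __init__(self, max_messages=100, interval_seconds=15):
        self.max_messages = max_messages
        self.interval_seconds = interval_seconds
        self.buffer = []
        self.batch_start = None

    def add(self, message, now):
        """Buffer a message arriving at time `now`; return a batch if a
        threshold is reached, else None."""
        if self.batch_start is None:
            self.batch_start = now
        self.buffer.append(message)
        if len(self.buffer) >= self.max_messages:
            return self.flush()  # count threshold reached first
        if now - self.batch_start >= self.interval_seconds:
            return self.flush()  # interval threshold reached first
        return None

    def flush(self):
        batch, self.buffer, self.batch_start = self.buffer, [], None
        return batch


pusher = BatchPusher(max_messages=100, interval_seconds=15)

# 100 messages arrive within 10 seconds (0.1 s apart): the push fires at
# message 100 without waiting for the 15-second interval to elapse.
batch = None
for i in range(100):
    batch = pusher.add(f"msg-{i}", now=i * 0.1)

print(len(batch))  # 100
```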

Configure filtering and transformation

In the Filtering and Transformation configuration wizards, define event filtering and transformation rules and click Next.

For more information about event transformation, see Use Function Compute for data cleansing.
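Conceptually, a filter keeps an event only when it matches a pattern. The sketch below illustrates that idea with a minimal matcher; it is an assumption for illustration and does not reproduce the exact EventBridge rule syntax, which is documented in the linked topic:

```python
def matches(pattern, event):
    """Illustrative pattern match: every pattern key must exist in the event,
    and the event's value must be one of the listed candidates. Nested dicts
    are matched recursively. Not the actual EventBridge rule syntax."""
    for key, candidates in pattern.items():
        if isinstance(candidates, dict):
            if not isinstance(event.get(key), dict) or not matches(candidates, event[key]):
                return False
        elif event.get(key) not in candidates:
            return False
    return True


# Hypothetical pattern: keep only events consumed from the "orders" topic.
pattern = {"data": {"topic": ["orders"]}}

print(matches(pattern, {"data": {"topic": "orders", "key": "k1"}}))  # True
print(matches(pattern, {"data": {"topic": "payments"}}))             # False
```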

Configure the sink

Set Service Type to ApsaraMQ for Kafka in the Sink configuration wizard, configure the following parameters, and click Save.

Parameter | Description | Example
Instance ID | Destination ApsaraMQ for Kafka instance. | test
Topic | Topic on the destination instance. | test
Acknowledgment Mode | Acknowledgment signal that the destination instance sends to the client after receiving data. | None
Message Value | EventBridge extracts the specified data from an event in binary format, decodes it from Base64, and routes it to the destination. Use a JSONPath expression to specify the data field. | Binary extraction: $.data.value
Message Key | EventBridge extracts data from an event using a JSONPath expression and routes the specified content to the destination. | Partial event: $.data.key

Configure retry and dead-letter policies

Set the retry policy and dead-letter queue for the event stream. For more information, see Retries and dead-letter queues.

Enable the event stream

  1. Click Save to create the event stream.

  2. Return to the Event Streams page, find the event stream, and click Enable in the Actions column.

  3. In the Prompt dialog box, click Confirm.

The event stream takes 30 to 60 seconds to start. Monitor the startup progress in the Status column on the Event Streams page.

Verify message routing

After the event stream starts, send a test message from the source instance and verify that it arrives at the destination.

Send a test message

  1. Log on to the ApsaraMQ for Kafka console.

  2. In the top navigation bar, select the region of the source instance.

  3. In the left-side navigation pane, click Instances.

  4. Find the source instance and click Details in the Actions column.

  5. In the left-side navigation pane, click Topic Management.

  6. Click the name of the source topic that you configured when you created the event stream.

  7. On the topic details page, click Send Message for Trial.

  8. In the Send and Receive Message for Trial panel, set Sending Method to Console, enter a Message Key and Message Content, and click OK.

Verify the routed message

  1. Return to the Instances page.

  2. Find the destination instance and click Details in the Actions column.

  3. In the left-side navigation pane, click Topic Management.

  4. Click the name of the destination topic that you configured when you created the event stream.

  5. On the topic details page, click Message Query.

  6. Set Query Method, Partition, and Time Point, and click Query.

  7. Confirm that the queried key and value match the message you sent from the source instance.
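The check in the last step rests on a round-trip invariant: the bytes produced on the source topic should match the bytes queried on the destination topic, because the stream Base64-encodes the value into the envelope and the sink decodes it again. A minimal sketch, using a hypothetical sample value:

```python
import base64

# Hypothetical message value produced on the source topic.
sent_value = b'{"orderId": "1001"}'

# Stream side: the envelope carries the Base64-encoded value ($.data.value).
envelope_value = base64.b64encode(sent_value).decode("ascii")

# Sink side: binary extraction of $.data.value decodes back to the original bytes.
routed_value = base64.b64decode(envelope_value)

print(routed_value == sent_value)  # True
```

If the queried value looks Base64-encoded rather than matching the original content, check that the sink's Message Value is configured as Binary extraction of $.data.value.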
