This topic describes how to create a sink connector task in the ApsaraMQ for Kafka console to synchronize data from ApsaraMQ for Kafka to Function Compute.
Prerequisites
An ApsaraMQ for Kafka instance is purchased and deployed. Make sure that the instance is in the Running state. For more information, see Step 2: Purchase and deploy an instance.
Function Compute is activated. For more information, see Activate Function Compute.
EventBridge is activated and the required permissions are granted to a Resource Access Management (RAM) user. For more information, see Activate EventBridge and grant permissions to a RAM user.
What is Function Compute?
Function Compute is a fully managed and event-driven serverless computing service. Function Compute allows you to focus on writing and uploading code without the need to manage infrastructure resources such as servers. Function Compute prepares elastic computing resources for you to run code in a reliable manner. For more information, see What is Function Compute?
What can Function Compute be used for?
You can use Function Compute to develop and run the logic that processes business messages, such as order processing and task execution.
You can use functions to quickly process messages and perform data cleansing based on extract, transform, and load (ETL).
You can use Function Compute to dump messages to other downstream systems in a specific virtual private cloud (VPC).
You can use functions to connect a messaging system and another Alibaba Cloud service to deliver messages to the service.
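To make the message-processing use cases above concrete, the following is a minimal sketch of a Function Compute-style handler that cleanses incoming messages. The event envelope (a JSON list of records with a "body" field) is an illustrative assumption, not the documented payload format of the sink connector:

```python
import json

def handler(event, context):
    """A minimal Function Compute-style handler.

    The envelope below (a JSON list of records with a "body" field) is an
    illustrative assumption, not the documented payload format of the
    sink connector.
    """
    records = json.loads(event)
    if not isinstance(records, list):
        records = [records]
    processed = []
    for record in records:
        body = record.get("body", "")
        if body:
            # Example cleansing step: normalize the message body.
            processed.append(body.strip().lower())
    return json.dumps({"processed": len(processed)})
```

In a real function, the cleansing step would be replaced by your own business logic, such as writing the normalized records to a downstream system.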
Create a sink connector task
Log on to the ApsaraMQ for Kafka console. In the left-side navigation pane, go to the Tasks page.
In the top navigation bar, select a region, such as China (Hangzhou). On the Tasks page, click Create Task.
On the Create Task page, configure the Task Name and Description parameters and follow the on-screen instructions to configure other parameters. Then, click Save. The following section describes the parameters:
Task Creation
In the Source step, set the Data Provider parameter to Message Queue for Apache Kafka and follow the on-screen instructions to configure other parameters. Then, click Next Step. The following table describes the parameters.
Parameter: Region
Description: The region where the ApsaraMQ for Kafka instance resides.
Example: China (Beijing)

Parameter: Message Queue for Apache Kafka Instance
Description: The ApsaraMQ for Kafka instance on which the messages that you want to route are produced.
Example: MQ_INST_115964845466****_ByBeUp3p

Parameter: Topic
Description: The topic on the ApsaraMQ for Kafka instance on which the messages that you want to route are produced.
Example: topic

Parameter: Group ID
Description: The name of the consumer group on the ApsaraMQ for Kafka instance. Use a separate consumer group to create the message routing source. Do not reuse a consumer group that an existing messaging service already uses. Otherwise, the existing service may fail to send or receive messages.
Example: GID_http_1

Parameter: Concurrency Quota (Consumers)
Description: The number of consumers on the ApsaraMQ for Kafka instance.
Example: 1

Parameter: Consumer Offset
Description: The offset from which messages are consumed.
Example: Latest Offset

Parameter: Network Configuration
Description: The type of the network over which you want to route messages.
Example: Basic Network

Parameter: VPC
Description: The ID of the VPC in which the ApsaraMQ for Kafka instance is deployed. This parameter is required only if you set the Network Configuration parameter to Internet.
Example: vpc-bp17fapfdj0dwzjkd****

Parameter: vSwitch
Description: The ID of the vSwitch with which the ApsaraMQ for Kafka instance is associated. This parameter is required only if you set the Network Configuration parameter to Internet.
Example: vsw-bp1gbjhj53hdjdkg****

Parameter: Security Group
Description: The security group to which the ApsaraMQ for Kafka instance belongs. This parameter is required only if you set the Network Configuration parameter to Internet.
Example: alikafka_pre-cn-7mz2****

Parameter: Batch Push
Description: The batch push feature aggregates multiple events into a single push. A push is triggered when the condition specified by either the Messages parameter or the Interval (Unit: Seconds) parameter is met. For example, if you set the Messages parameter to 100 and the Interval (Unit: Seconds) parameter to 15, the push is executed when the number of messages reaches 100 even if only 10 seconds have elapsed.
Example: Enable

Parameter: Messages
Description: The maximum number of messages that are sent in each function invocation. Requests are sent only when the number of messages in the backlog reaches the specified value. Valid values: 1 to 10000.
Example: 100

Parameter: Interval (Unit: Seconds)
Description: The time interval at which the function is invoked. The system sends the aggregated messages to Function Compute at the specified interval. Valid values: 0 to 15. Unit: seconds. A value of 0 indicates that messages are sent immediately after aggregation.
Example: 3
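The batch push trigger condition described above (flush when either the message threshold or the time interval is reached, whichever comes first) can be sketched as follows. This is a model of the semantics only; the real aggregation happens inside EventBridge, not in user code:

```python
import time

class BatchPusher:
    """Aggregates messages and flushes when either limit is hit.

    A sketch of the batch-push semantics: a flush is triggered as soon
    as the buffered message count reaches max_messages OR the time since
    the first buffered message reaches max_interval_s.
    """

    def __init__(self, max_messages=100, max_interval_s=15):
        self.max_messages = max_messages
        self.max_interval_s = max_interval_s
        self.buffer = []
        self.first_arrival = None

    def add(self, message, now=None):
        now = time.monotonic() if now is None else now
        if self.first_arrival is None:
            self.first_arrival = now
        self.buffer.append(message)
        # Flush when the count threshold OR the interval threshold is met.
        if (len(self.buffer) >= self.max_messages
                or now - self.first_arrival >= self.max_interval_s):
            return self.flush()
        return None

    def flush(self):
        batch, self.buffer, self.first_arrival = self.buffer, [], None
        return batch
```

With max_messages=100 and max_interval_s=15, the 100th message triggers a flush even if only 10 seconds have passed, matching the example in the table above.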
In the Filtering step, define a data pattern in the Pattern Content code editor to filter requests. For more information, see Message filtering.
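For instance, a pattern that passes through only events from a particular source might look like the following. The field names and structure here are an illustrative assumption; see Message filtering for the exact pattern syntax that EventBridge supports:

```json
{
  "source": [
    "acs.kafka"
  ]
}
```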
In the Transformation step, specify a data cleansing method to implement data processing capabilities such as splitting, mapping, enrichment, and dynamic routing. For more information, see Data cleansing.
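Conceptually, a cleansing transform of the kind configured in this step, here a hypothetical split-and-map example rather than one of the console's built-in templates, might do the following:

```python
def transform(message: str) -> dict:
    """Split a comma-separated message and map its fields to named keys.

    A hypothetical illustration of the splitting and mapping capabilities
    this step supports; the actual transform is configured in the console.
    """
    order_id, amount, currency = message.split(",")
    return {
        "orderId": order_id.strip(),
        "amount": float(amount),
        "currency": currency.strip().upper(),
    }
```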
In the Sink step, set the Service Type parameter to Function Compute and follow the on-screen instructions to configure other parameters. The following table describes the parameters.
Parameter: Service
Description: Select the Function Compute service that you created.
Example: test

Parameter: Function
Description: Select the Function Compute function that you created.
Example: test

Parameter: Service Version and Alias
Description: Select a service version or alias.
Example: Default Version

Parameter: Invocation Mode
Description: Select Synchronous or Asynchronous.
Example: Asynchronous

Parameter: Event
Description: Select an event transformation method. For more information, see Event transformation.
Example: Complete Data
In the Task Property step, configure the retry policy and dead-letter queue for the task. For more information, see Retry policies and dead-letter queues.
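The retry-then-dead-letter flow configured in this step can be illustrated with a small sketch. The function and parameters below are stand-ins for illustration; the real policy runs inside EventBridge, not in user code:

```python
def deliver_with_retry(send, message, max_retries=3, dead_letters=None):
    """Attempt delivery; route the message to a dead-letter list on failure.

    `send` is any callable that raises an exception on failure. After the
    initial attempt plus max_retries retries all fail, the message is
    appended to the dead-letter list instead of being dropped.
    """
    for attempt in range(1 + max_retries):
        try:
            send(message)
            return True
        except Exception:
            continue
    if dead_letters is not None:
        dead_letters.append(message)
    return False
```

Messages routed to the dead-letter queue can later be inspected and redelivered, which is the purpose of configuring one here.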
Go back to the Tasks page, find the sink connector task that you created, and then click Enable in the Actions column.
In the Note message, click OK.
Enabling a sink connector task requires 30 to 60 seconds to complete. You can view the progress in the Status column on the Tasks page.
Other operations
On the Tasks page, find the sink connector task that you want to manage and perform other operations in the Actions column. The following items describe the operations that you can perform:
View the task details: Click Details in the Actions column. On the page that appears, view the basic information, properties, and monitoring metrics of the task.
Change the task configurations: Click Edit in the Actions column. In the Edit Task panel, change the configurations of the task.
Enable or disable the task: Click Enable or Pause in the Actions column. In the Note message, click OK.
Delete the task: Click Delete in the Actions column. In the Note message, click OK.