To configure Kafka Writer, select a table and configure field mappings.

Prerequisites

A reader or conversion node is configured. For more information, see Plug-ins for data sources that support real-time synchronization.

Procedure

  1. Go to the DataStudio page.
    1. Log on to the DataWorks console.
    2. In the left-side navigation pane, click Workspaces.
    3. Select the region where the required workspace resides, find the workspace, and then click Data Analytics.
  2. Move the pointer over the Create icon and choose Data Integration > Real-time synchronization.
    Alternatively, you can click the required workflow, right-click Data Integration, and then choose Create > Real-time synchronization.
  3. In the Create Node dialog box, set the Node Name and Location parameters.
    Notice The node name must be 1 to 128 characters in length. It can contain letters, digits, underscores (_), and periods (.).
  4. Click Commit.
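    The naming rule in the notice for step 3 can be expressed as a regular expression. This is a sketch only; the exact validation that the console performs is an assumption, and the function name is hypothetical:

    ```python
    import re

    # 1 to 128 characters: letters, digits, underscores (_), and periods (.)
    NODE_NAME_PATTERN = re.compile(r"^[A-Za-z0-9_.]{1,128}$")

    def is_valid_node_name(name: str) -> bool:
        """Check a node name against the documented naming rule."""
        return bool(NODE_NAME_PATTERN.fullmatch(name))

    print(is_valid_node_name("kafka_writer.v1"))  # → True
    print(is_valid_node_name("bad name!"))        # → False (space and ! are not allowed)
    ```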
  5. On the configuration tab of the real-time synchronization node, drag Kafka in the Output section to the canvas on the right. Connect the new node to the configured reader or conversion node.
  6. Click the Kafka node. In the Node Configuration panel, set the parameters.
    Parameter descriptions:
    • Kafka Cluster Address: The address of the Kafka broker. Specify the address in the IP address:Port number format.
    • Topic: The name of the Kafka topic to which you want to write data. Kafka maintains feeds of messages in categories called topics. Each message that is published to a Kafka cluster is assigned to a topic, and each topic contains a group of messages.
    • Key Column: The column whose value is used as the key of each Kafka message.
    • Value Column: The column whose value is used as the value of each Kafka message. If you leave this parameter empty, all columns are concatenated by using the delimiter specified by the Column separator parameter to form the value.
    • Key Type: The data type of the keys in the Kafka topic.
    • Value Type: The data type of the values in the Kafka topic.
    • Number of Bytes Written at a Time: The number of data records to write at a time. Default value: 1024.
    • Configuration parameters: The extended parameters that are specified when the Kafka client is created, such as the bootstrap.servers, auto.commit.interval.ms, and session.timeout.ms parameters. You can set parameters in kafkaConfig to control the behavior of the Kafka client. For a real-time synchronization node that writes data to a Kafka data source, the default value of the acks parameter for KafkaProducer is all. If you require higher write throughput, you can specify a different value for the acks parameter. Valid values of the acks parameter:
    • 0: The producer does not wait for any acknowledgment from the broker. Data loss can occur.
    • 1: The producer waits for an acknowledgment from the leader replica of the topic.
    • all: The producer waits for acknowledgments from all in-sync replicas of the topic.
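    For reference, the Configuration parameters field accepts JSON key-value pairs. The following sketch of a kafkaConfig value uses standard Kafka producer settings, but the specific values shown are illustrative assumptions, not recommendations:

    ```json
    {
      "acks": "1",
      "bootstrap.servers": "192.168.0.10:9092",
      "linger.ms": "100",
      "compression.type": "lz4"
    }
    ```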
  7. Click the Save icon in the toolbar to save the settings.
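The behavior described for the Value Column parameter, where all columns are joined by the Column separator when no value column is specified, can be sketched as follows. The function and field names are hypothetical; this illustrates the documented behavior, not the actual writer implementation:

```python
def build_kafka_value(record, value_column=None, separator=","):
    """Build the Kafka message value from a record.

    If a value column is specified, use that column alone; otherwise
    concatenate all column values with the separator, as the Value
    Column parameter description states.
    """
    if value_column is not None:
        return str(record[value_column])
    return separator.join(str(v) for v in record.values())

record = {"id": 1, "name": "alice", "city": "Hangzhou"}
print(build_kafka_value(record))                      # → 1,alice,Hangzhou
print(build_kafka_value(record, value_column="name"))  # → alice
```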