To configure Kafka Writer, select a table and configure the field mappings.

Prerequisites

A reader or transformation node is configured. For more information, see Supported data stores for real-time synchronization.

Procedure

  1. Go to the DataStudio page.
    1. Log on to the DataWorks console.
    2. In the left-side navigation pane, click Workspaces.
    3. Select the region where the required workspace resides, find the workspace, and then click Data Analytics.
  2. Move the pointer over the Create icon and choose Data Integration > Real-time synchronization.
    Alternatively, you can click the required workflow, right-click Data Integration, and then choose Create > Real-time synchronization.
  3. In the Create Node dialog box, set the Node Name and Location parameters.
    Notice: The node name must be 1 to 128 characters in length. It can contain letters, digits, underscores (_), and periods (.).
  4. Click Commit.
  5. On the configuration tab of the real-time sync node, drag Kafka under Output to the canvas on the right. Connect the new node to a reader or transformation node.
  6. Click the new Kafka node. In the configuration pane that appears, set the required parameters in the Node configuration section.
    Node configuration
    Parameter: Description
    server: The address of the Kafka broker, in the format of ip:port.
    topic: The name of the Kafka topic to which data is written. Kafka maintains feeds of messages in categories called topics. Each message published to the Kafka cluster is assigned to a topic, and each topic contains a group of messages.
    keyColumn: The column whose value is used as the key of each Kafka record.
    valueColumn: The column whose value is used as the value of each Kafka record. If this parameter is not specified, all columns are concatenated by using the delimiter specified by fieldDelimiter to form the value.
    keyType: The data type of the key in Kafka.
    valueType: The data type of the value in Kafka.
    batchSize: The number of data records that are written at a time. Default value: 1024.
    Configuration parameters: The extended parameters that are applied when the Kafka producer is created, such as bootstrap.servers, acks, and linger.ms. You can set parameters in kafkaConfig to control the write behavior of the producer.
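    The following sketch shows how these parameters might look when written out as a single JSON configuration. It is an illustrative example only, not the exact schema of a real-time sync node: the broker address, topic name, key column, the STRING type values, and the fieldDelimiter value are assumptions made for this example.

      {
        "server": "192.168.0.10:9092",
        "topic": "orders_rt",
        "keyColumn": "order_id",
        "keyType": "STRING",
        "valueType": "STRING",
        "fieldDelimiter": ",",
        "batchSize": 1024,
        "kafkaConfig": {
          "acks": "all",
          "linger.ms": "100",
          "request.timeout.ms": "30000"
        }
      }

    Because valueColumn is omitted in this example, all columns of each input record would be concatenated with the comma specified by fieldDelimiter to form the message value. The settings in kafkaConfig are passed through to the Kafka producer that the node creates.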
  7. Click the Save icon in the toolbar.