When you configure a Kafka writer, you only need to select a table and configure the field mappings.

Create a Kafka writer

  1. Log on to the DataWorks console. In the left-side navigation pane, click Workspaces. On the Workspaces page, find the target workspace and click Data Analytics in the Actions column.
  2. On the Data Analytics tab, move the pointer over the Create icon and choose Data Integration > Real-Time Sync.

    You can also find the target workflow, right-click Data Integration, and choose Create > Real-Time Sync.

  3. In the Create Node dialog box that appears, set Node Name and Location, and then click Commit.
  4. On the configuration tab of the created real-time sync node, drag Kafka under Writer to the editing panel, and connect it to the desired reader or transformation node.
  5. Click the Kafka writer node and set parameters in the Node Settings section.
    | Parameter | Description |
    | --- | --- |
    | server | The address of the Kafka broker, in the format ip:port. |
    | topic | The name of the Kafka topic to which data is written. Kafka maintains feeds of messages in categories called topics. Each message published to a Kafka cluster is assigned to a topic, and each topic contains a group of messages. |
    | keyColumn | The column that is used as the key of each Kafka record. |
    | valueColumn | The column that is used as the value of each Kafka record. If this parameter is not specified, all columns are concatenated by using the delimiter specified by fieldDelimiter to form the value. |
    | keyType | The data type of the Kafka key. |
    | valueType | The data type of the Kafka value. |
    | batchSize | The number of data records that are written at a time. Default value: 1024. |
    | Configuration parameters | The extended parameters that are specified when the Kafka producer is created, such as bootstrap.servers, acks, and linger.ms. You can set parameters in kafkaConfig to control the write behavior of the producer. |

    For an illustration of how these parameters map to the settings of a standard Kafka producer, see the sketch after this procedure.
  6. Click the Save icon in the toolbar to save the settings.
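
The node parameters in step 5 correspond closely to the settings of a standard Kafka producer. The following sketch is only an illustration of that mapping: it uses the open-source kafka-python client rather than DataWorks itself, and the broker address, topic name, and record values are hypothetical.

```python
# Illustration only: how the Kafka writer parameters relate to a plain
# Kafka producer. Uses the open-source kafka-python client, not DataWorks;
# all values below are hypothetical.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="192.168.0.1:9092",          # "server": broker address in ip:port format
    key_serializer=lambda k: k.encode("utf-8"),    # "keyType": how the key column is serialized
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),  # "valueType"
    batch_size=16384,   # roughly analogous to "batchSize" (bytes here, records in DataWorks)
    acks="all",         # example of an extended parameter ("Configuration parameters")
    linger_ms=5,        # another extended producer parameter
)

# One record per source row: "keyColumn" supplies the key, and "valueColumn"
# (or the delimiter-joined columns) supplies the value.
record = {"id": 1, "name": "test"}
producer.send("my_topic", key=str(record["id"]), value=record)  # "topic"
producer.flush()
```

You do not write this code when you use the real-time sync node; it is only meant to show what each parameter controls when records are written to Kafka.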