When you configure a Kafka writer, you only need to select a table and configure the field mappings.
Create a Kafka writer
- Log on to the DataWorks console. In the left-side navigation pane, click Workspaces. On the Workspaces page, find the target workspace and click Data Analytics in the Actions column.
- On the Data Analytics tab, move the pointer over the icon and choose the command for creating a real-time sync node. Alternatively, find the target workflow, right-click Data Integration, and choose the same command.
- In the Create Node dialog box that appears, set Node Name and Location, and then click Commit.
- On the configuration tab of the created real-time sync node, drag the Kafka node to the editing panel, and then connect it to the desired reader or transformation node in the panel.
- Click the Kafka writer node and set parameters in the Node Settings section.
| Parameter | Description |
| --- | --- |
| server | The address of the Kafka broker, in the `host:port` format. |
| topic | The name of the Kafka topic to which data is written. Kafka maintains feeds of messages in categories called topics. Each message published to the Kafka cluster is assigned to a topic, and each topic contains a group of messages. |
| keyColumn | The column that is used as the key. |
| valueColumn | The column that is used as the value. If this parameter is not specified, all columns are concatenated by using the delimiter specified by fieldDelimiter to form the value. |
| keyType | The data type of the Kafka key. |
| valueType | The data type of the Kafka value. |
| batchSize | The number of data records that are written at a time. Default value: 1024. |
| Configuration parameters (kafkaConfig) | The extended parameters specified when the Kafka client is created, such as bootstrap.servers, auto.commit.interval.ms, and session.timeout.ms. You can set parameters in kafkaConfig to control the behavior of the Kafka client. |
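Taken together, the parameters in the table might be filled in as in the following sketch. This is an illustrative fragment, not an exact node configuration from this document; the broker addresses, topic, and column names are placeholders.

```json
{
  "server": "192.168.0.1:9092,192.168.0.2:9092",
  "topic": "example_topic",
  "keyColumn": "id",
  "valueColumn": "name",
  "keyType": "string",
  "valueType": "string",
  "batchSize": 1024,
  "kafkaConfig": {
    "bootstrap.servers": "192.168.0.1:9092",
    "session.timeout.ms": "30000"
  }
}
```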
- Click the icon in the toolbar to save the configuration.
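The valueColumn behavior described in the table above can be sketched in a few lines of Python. This is not DataWorks code; the function and field names are illustrative. It shows how the message value is formed: if valueColumn is set, only those columns are used; otherwise all columns are concatenated with the fieldDelimiter.

```python
def build_value(record: dict, value_columns=None, field_delimiter="\t") -> str:
    """Return the Kafka message value for one record (illustrative sketch)."""
    if value_columns:
        # valueColumn specified: use only the listed columns, in order.
        return field_delimiter.join(str(record[c]) for c in value_columns)
    # valueColumn omitted: concatenate every column with the delimiter.
    return field_delimiter.join(str(v) for v in record.values())

record = {"id": 1, "name": "alice", "city": "Beijing"}
print(build_value(record))                            # → 1\talice\tBeijing
print(build_value(record, value_columns=["name"]))    # → alice
```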