Lindorm: Import incremental data from Log Service

Last Updated: Jul 18, 2023

This topic describes how to import incremental data from Log Service to a Lindorm wide table in the Lindorm console.

Usage notes

This feature is no longer available for LTS instances that were purchased after June 16, 2023. If your LTS instance was purchased before June 16, 2023, you can continue to use this feature.

Supported destination table types

Only tables that are created by executing Lindorm SQL statements are supported as destination tables.
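
The following is a minimal sketch of creating such a destination table with Lindorm SQL. The table name default.sls and its columns are hypothetical and are chosen to match the writer configuration shown later in this topic.

CREATE TABLE IF NOT EXISTS default.sls (
  id VARCHAR,      -- primary key column; receives the id field from the log data
  col1 VARCHAR,    -- receives the concatenated value computed by the writer
  col2 VARCHAR,    -- receives the value of the __client_ip__ field
  PRIMARY KEY (id)
);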

Procedure

  1. Log on to the LTS web UI and choose Import Lindorm/HBase > SLS incremental Import.

  2. On the SLS incremental Import page, click create new job.

  3. Configure Tunnel Name, select the source cluster and the destination cluster, and enter the name of the table to be synchronized or migrated.

  4. Click create. After the tunnel is created, you can view its details.

Parameters

The following sample tunnel configuration describes the parameters that you can specify:

{
  "reader": {
    "columns": [
      "__client_ip__",
      "C_Source",
      "id",
      "name"
    ],
    "consumerSize": 2, // This parameter specifies the number of consumers that subscribe to the LogHub data. The default value is 1.
    "logstore": "LTS-test"
  },
  "writer": {
    "columns": [
      {
        "name": "col1",
        "value": "{{ concat('xx', name) }}" // This parameter supports expressions.
      },
      {
        "name": "col2",
        "value": "__client_ip__" // This parameter specifies the mapped column name.
      },
      {
        "isPk": true, // This parameter specifies whether the column is included in the primary key.
        "name": "id", // You do not need to specify the column family for primary key columns.
        "value": "id"
      }
    ],
    "tableName": "default.sls"
  }
}
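
For example, assuming a hypothetical log entry in which __client_ip__ is 192.0.2.1, name is Alice, and id is 001, the preceding writer configuration would produce a row in default.sls similar to the following:

id:   001        // primary key, copied from the id field of the log entry
col1: xxAlice    // computed by the expression {{ concat('xx', name) }}
col2: 192.0.2.1  // mapped from the __client_ip__ field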
            

Simple Jtwig expressions such as the following are supported in the value field. For more information about Jtwig syntax, see Jtwig syntax.

{
  "name": "hhh",
  "value": "{{ concat(title, id) }}"
}
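
The two patterns shown in this topic can also be combined. The following is a minimal sketch that builds a value from a string literal and two log fields; the column name col3 is hypothetical, and the sketch assumes that nested concat calls are supported:

{
  "name": "col3",
  "value": "{{ concat('xx', concat(name, id)) }}"
}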