
Tablestore: Get started with Tunnel Service

Last Updated: May 14, 2025

You can get started with Tunnel Service in the Tablestore console.

Procedure

Step 1: Create a tunnel

  1. Go to the Tunnels page.

    1. Log on to the Tablestore console.

    2. In the top navigation bar, select a resource group and a region.

    3. On the Overview page, click the instance name or click Manage Instance in the Actions column of the instance.

    4. On the Instance Details tab, click the Tables tab. Then, click the name of the data table and click the Tunnels tab. Alternatively, you can click the more icon in the Actions column of the data table and select Tunnels from the shortcut menu.

  2. On the Tunnels tab, click Create Tunnel.

  3. In the Create Tunnel dialog box, configure the parameters. The following table describes the parameters.

    • Tunnel Name: The name of the tunnel.

    • Type: The type of the tunnel. Valid values:

      • Full: Only full data can be consumed and processed.

      • Incremental: Only incremental data can be consumed and processed.

      • Differential: After full data is consumed and processed, incremental data is consumed and processed.

    • Start Time: If you set the Type parameter to Incremental or Differential, the system considers data that is written to the data table after the tunnel is created as incremental data. If you want to consume incremental data that is written to the data table after a specific point in time, configure the Start Time parameter.

      Valid value range of the Start Time and End Time parameters: [Current system time - Stream validity period + 5 minutes, Current system time]. The values of the two parameters are 64-bit timestamps in milliseconds.

      Important: The Stream validity period is the validity period of incremental logs. The maximum Stream validity period is seven days. You can specify the Stream validity period when you enable Stream for the data table. You cannot modify the Stream validity period after you specify it.

    • End Time: The end time of incremental data consumption. You can configure this parameter based on your business requirements. The value of the End Time parameter must be greater than the value of the Start Time parameter.

  4. Click OK.

    After the tunnel is created, click Show Channels in the Actions column of the tunnel. You can view the data content in the tunnel, consumption latency monitoring information, and the number of consumed data rows in each channel.


Step 2: (Optional) Preview the data format in the tunnel

After you create a tunnel, you can simulate data consumption to preview the format of the data in the tunnel.

  1. Write data to or delete data from the table.

  2. On the Tunnels tab of the table, click Show Channels in the Actions column of the tunnel.

  3. In the Actions column of the channel, click View Simulated Export Records.

  4. In the View Simulated Export Records dialog box, click Start.

    The information of the consumed data is displayed in the dialog box, as shown in the following figure.


    Example of consumed data format

    The following is an example of the consumed data format:

    {
      "sequenceInfo": {
        "epoch": 0,
        "rowIndex": 0,
        "timestamp": 0
      },
      "recordType": "PUT",
      "columns": [
        {
          "actionType": "PUT",
          "name": "create_time",
          "type": "STRING",
          "value": "2024-02-18 22:10:07"
        },
        {
          "actionType": "PUT",
          "name": "modified_time",
          "type": "STRING",
          "value": "2024-02-18 22:10:07"
        },
        {
          "actionType": "PUT",
          "name": "num",
          "type": "INTEGER",
          "value": 29
        },
        {
          "actionType": "PUT",
          "name": "order_status",
          "type": "STRING",
          "value": "00"
        },
        {
          "actionType": "PUT",
          "name": "price",
          "type": "DOUBLE",
          "value": 400
        },
        {
          "actionType": "PUT",
          "name": "sku_id",
          "type": "STRING",
          "value": "9000000007"
        },
        {
          "actionType": "PUT",
          "name": "total_price",
          "type": "DOUBLE",
          "value": 11600
        },
        {
          "actionType": "PUT",
          "name": "user_id",
          "type": "STRING",
          "value": "1000000042"
        }
      ],
      "primaryKey": [
        {
          "name": "order_id",
          "type": "STRING",
          "value": "90fb478c-1360-11f0-a34d-00163e30a2a9"
        }
      ],
      "timestamp": 0
    }
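A record in this format can be deserialized with standard JSON tooling before you hand it to your own processing logic. A minimal Go sketch (the struct fields mirror only the keys shown in the sample above and are an assumption, not the full record schema delivered by the SDK):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Column is one column entry in a tunnel record, used for both
// attribute columns and primary key columns in the sample format.
type Column struct {
	ActionType string      `json:"actionType"`
	Name       string      `json:"name"`
	Type       string      `json:"type"`
	Value      interface{} `json:"value"`
}

// Record mirrors the keys in the sample record above; the real
// payload may carry additional fields.
type Record struct {
	RecordType string   `json:"recordType"`
	Columns    []Column `json:"columns"`
	PrimaryKey []Column `json:"primaryKey"`
	Timestamp  int64    `json:"timestamp"`
}

// parseRecord deserializes one tunnel record from its JSON form.
func parseRecord(raw []byte) (Record, error) {
	var r Record
	err := json.Unmarshal(raw, &r)
	return r, err
}

func main() {
	raw := []byte(`{"recordType":"PUT",
		"primaryKey":[{"name":"order_id","type":"STRING","value":"90fb478c"}],
		"columns":[{"actionType":"PUT","name":"num","type":"INTEGER","value":29}],
		"timestamp":0}`)
	r, err := parseRecord(raw)
	if err != nil {
		panic(err)
	}
	fmt.Println(r.RecordType, r.PrimaryKey[0].Value, len(r.Columns))
}
```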

Step 3: Use the tunnel to consume data

  1. Copy the tunnel ID from the tunnel list on the Tunnels tab.

  2. Use Tablestore SDK for Java or Tablestore SDK for Go to quickly consume data by using the tunnel.

    After the data is consumed, you can view incremental consumption logs, such as consumption statistics and the most recent synchronization time of each incremental channel. You can also view consumption latency and the number of consumed data rows in each channel in the Tablestore console.
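The consumer you build with the SDK ultimately receives records like the one previewed in Step 2. A minimal, SDK-agnostic sketch of a record handler in Go (the simplified record shape and the handler signature are assumptions for illustration, not the actual callback API of the Tablestore SDK):

```go
package main

import "fmt"

// TunnelRecord is a simplified stand-in for a record delivered by a
// tunnel channel; real SDK records carry more metadata and columns.
type TunnelRecord struct {
	RecordType string            // "PUT", "UPDATE", or "DELETE"
	PrimaryKey map[string]string // primary key columns of the row
}

// handleRecord dispatches one record to type-specific processing and
// returns a short description of the action taken.
func handleRecord(r TunnelRecord) string {
	switch r.RecordType {
	case "PUT":
		return "upsert row " + r.PrimaryKey["order_id"]
	case "UPDATE":
		return "apply column changes to row " + r.PrimaryKey["order_id"]
	case "DELETE":
		return "remove row " + r.PrimaryKey["order_id"]
	default:
		return "skip unknown record type " + r.RecordType
	}
}

func main() {
	rec := TunnelRecord{
		RecordType: "PUT",
		PrimaryKey: map[string]string{"order_id": "90fb478c"},
	}
	fmt.Println(handleRecord(rec))
}
```

In a real consumer, a handler like this would run inside the process-records callback that the SDK invokes for each batch of records in a channel.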

Development integration

You can use the Tablestore CLI or Tablestore SDKs to perform tunnel operations.

  • Create a tunnel: Tablestore SDKs (Java and Go) or the Tablestore CLI

  • Query information about all tunnels of a data table: Tablestore SDKs (Java and Go) or the Tablestore CLI

  • Query information about a tunnel: Tablestore SDKs (Java and Go) or the Tablestore CLI

  • Simulate data consumption by using a tunnel: the Tablestore CLI

  • Delete a tunnel: Tablestore SDKs (Java and Go) or the Tablestore CLI