When you call an API operation to configure or query a Data Transmission Service (DTS) task, you may need to specify or query the Reserve parameter. The Reserve parameter allows you to view the configurations of the source or destination instance or add more configurations to the DTS task. For example, you can specify the data storage format of the destination Kafka database and the ID of the Cloud Enterprise Network (CEN) instance. This topic describes the use scenarios and sub-parameters of the Reserve parameter.

Description

The value of the Reserve parameter is a JSON string. You can specify the Reserve parameter in the following scenarios.

Note If you specify a numeric value, you must enclose the value in double quotation marks ("").
  • You can use the following Reserve parameter to set Processing Mode of Conflicting Tables for a data migration or synchronization task.
    targetTableMode (required): The processing mode of conflicting tables. Valid values:
    • 0: performs a precheck and reports errors.
    • 2: ignores errors and proceeds.
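    For example, to ignore conflicting tables and proceed, the Reserve value can be set as follows. Note that the numeric value is enclosed in double quotation marks:
    {
        "targetTableMode": "2"
    }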
  • If the database type of the destination instance is Kafka, you can use the Reserve parameter to configure the information about the Kafka instance and the storage format in which data is shipped to the Kafka instance.
    destTopic (required): The topic to which the objects migrated or synchronized to the destination Kafka instance belong.
    destVersion (required): The engine version of the destination Kafka instance. Valid values: 0.9, 0.10, and 1.0.
    Note If the engine version of the destination Kafka instance is 1.0 or later, you must set the value to 1.0.
    destSSL (required): Specifies whether to encrypt the connection to the destination Kafka instance. Valid values:
    • 0: does not encrypt the connection.
    • 3: encrypts the connection by using the SCRAM-SHA-256 algorithm.
    dest.kafka.schema.registry.url (optional): If you use Kafka Schema Registry, enter the URL or IP address that is registered in Kafka Schema Registry for your Avro schemas.
    sink.kafka.ddl.topic (optional): The topic that is used to store the DDL information. If you do not specify this parameter, the DDL information is stored in the topic that is specified by the destTopic parameter.
    kafkaRecordFormat (required): The storage format in which data is shipped to the destination Kafka instance. Valid values:
    • canal_json: Canal parses the incremental logs of the source database and transfers the incremental data to the destination Kafka instance in the Canal JSON format.
    • dts_avro: Avro is a data serialization format into which data structures or objects can be converted to facilitate storage and transmission.
    • shareplex_json: The data replication software SharePlex reads the data in the source database and writes the data to the destination Kafka instance in the SharePlex JSON format.
    • debezium: Debezium is a change data capture tool that supports real-time streaming of data updates from the source PolarDB for Oracle cluster to the destination Kafka instance.
    Note For more information, see Data formats of a Kafka cluster.
    destKafkaPartitionKey (required): The policy used to synchronize data to Kafka partitions. Valid values:
    • none: DTS synchronizes all data and DDL statements to Partition 0 of the destination topic.
    • database_table: DTS uses the database and table names as the partition key to calculate the hash value. Then, DTS synchronizes the data and DDL statements of each table to the corresponding partition of the destination topic.
    • columns: DTS uses a table column as the partition key to calculate the hash value. By default, the primary key is used. If a table does not have a primary key, the unique key is used as the partition key. DTS synchronizes each row to the corresponding partition of the destination topic. You can specify one or more columns as partition keys to calculate the hash value.
    Note For more information about synchronization policies, see Specify the policy for synchronizing data to Kafka partitions.
    Example:
    {
        "destTopic": "dtstestdata",
        "destVersion": "1.0",
        "destSSL": "0",
        "dest.kafka.schema.registry.url": "http://12.1.12.**/api",
        "sink.kafka.ddl.topic": "dtstestdata",
        "kafkaRecordFormat": "canal_json",
        "destKafkaPartitionKey": "none"
    }
  • If the database type of the source or destination instance is MongoDB, you must use the Reserve parameter to specify the architecture type of the MongoDB database.
    srcEngineArchType (required): The architecture type of the source MongoDB database. Valid values:
    • 0: standalone architecture
    • 1: replica set architecture
    • 2: sharded cluster architecture
    Example:
    {
        "srcEngineArchType": "1"
    }
    destEngineArchType (required): The architecture type of the destination MongoDB database. Valid values:
    • 0: standalone architecture
    • 1: replica set architecture
    • 2: sharded cluster architecture
    Example:
    {
        "destEngineArchType": "1"
    }
  • If the source or destination instance is a self-managed database connected over CEN, you must use the Reserve parameter to specify the ID of the CEN instance.
    srcInstanceId (required): The ID of the CEN instance for the source instance.
    Example:
    {
        "srcInstanceId": "cen-9kqshqum*******"
    }
    destInstanceId (required): The ID of the CEN instance for the destination instance.
    Example:
    {
        "destInstanceId": "cen-9kqshqum*******"
    }
  • If the destination instance is a DataHub or MaxCompute project, you must specify the naming rules for additional columns.
    isUseNewAttachedColumn (required): The naming rules for additional columns. Valid values:
    • true: uses the new naming rules.
    • false: uses the old naming rules. In this case, make sure that disableAttachedDTSColumn is not set to true.
    disableAttachedDTSColumn (optional): Specifies whether to disable the default naming rules for additional columns. To disable the default rules, set this parameter to true and isUseNewAttachedColumn to false.
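    For example, to use the new naming rules for additional columns, the Reserve value can be set as follows. The boolean value is shown as a quoted string for consistency with the other examples in this topic; confirm the expected format for your API version:
    {
        "isUseNewAttachedColumn": "true"
    }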
  • If the source instance is an Oracle database, you must specify the type of the database.
    srcOracleType (required): The type of the Oracle database. Valid values:
    • sid: non-RAC
    • serviceName: RAC or PDB
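    For example, for a non-RAC Oracle database, the Reserve value can be set as follows:
    {
        "srcOracleType": "sid"
    }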