When you call the ModifyDtsJobConfig operation to modify the parameters of a Data Transmission Service (DTS) task, you must configure the Parameters parameter.
Parameter description
The Parameters parameter is a string in JSON format. To modify a parameter, specify the following fields:
module
The module to which the parameter belongs. Valid values:
03: the Full Data Migration module of data migration or data synchronization tasks.
04: the Incremental Write module of data migration tasks.
07: the Incremental Write module of data synchronization tasks.
60: the synchronization module of Kafka-to-Kafka and SLS-to-ClickHouse tasks.
name
The parameter name. For more information, see the following table.
value
The parameter value. For more information, see the following table.
| Parameter (name) | Configurable modules (module) | Description (value) |
| --- | --- | --- |
| source.column.encoding | 03, 04, or 07 | The character encoding of the business data written to the source Oracle database. Use this parameter to prevent garbled text that can occur when the database character set is incompatible with the character set of the business data. Valid values: |
| sink.worker.type | 03 | Use the sink.worker.type and sink.bulk.copy.enable parameters together to configure the data write mode for MySQL or PostgreSQL databases. |
| sink.bulk.copy.enable | 03 | Used together with sink.worker.type to configure the data write mode for MySQL or PostgreSQL databases. |
| sink.batch.size.minimum | 03 | The minimum number of records that a write thread writes to the destination database in a single batch. The value must be an integer from 0 to 1,024. |
| sink.batch.size.maximum | 03, 04, or 07 | The maximum number of records that a write thread writes to the destination database in a single batch. The value must be an integer from 0 to 1,024. |
| source.connection.idle.second | 03, 04, or 07 | The timeout period for reconnecting to the source database. If DTS reconnects to the source database within this period, the task automatically resumes. Otherwise, the task fails. The value must be an integer from 0 to 86,400. Unit: seconds. |
| sink.connection.idle.second | 03, 04, or 07 | The timeout period for reconnecting to the destination database. If DTS reconnects to the destination database within this period, the task automatically resumes. Otherwise, the task fails. The value must be an integer from 0 to 86,400. Unit: seconds. |
| trans.hot.merge.enable | 04 or 07 | Specifies whether to enable hot spot merging. Valid values: true and false. |
| sink.batch.enable | 04 or 07 | Specifies whether to send data in batches. Valid values: true and false. |
| source.filter.ddl.enable | 04 or 07 | Specifies whether to filter out DDL statements. Valid values: true and false. |
| sink.ignore.failed.ddl | 04 or 07 | Specifies whether to ignore DDL statements that fail to execute. Valid values: true and false. |
| trans.size.maximum | 04 or 07 | The threshold for transaction splitting. The value must be an integer from 0 to 1,024. |
| dts.datamove.record.spouter.writers | 07 | The number of data write threads. The value must be an integer from 0 to 64. |
| selectdb.reservoir.group.by.target.schema | 04 or 07 | Specifies whether to batch data by destination table name during writes. Valid values: true and false. |
| selectdb.reservoir.timeout.milliseconds | 04 or 07 | The batching time for a single data write. Unit: milliseconds. |
| sink.task.number | 03, 04, 07, or 60 | The number of threads that write data to the destination database. Increasing this value can improve write performance in scenarios without hot spots, but it also increases the load on the destination database. |
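The entries described in the table can be assembled and validated programmatically before they are passed to the operation. The following sketch assumes Python; the `build_parameters` helper and the `RANGES` map are hypothetical and only mirror the integer ranges listed in the table above, not part of any DTS SDK:

```python
import json

# Integer value ranges taken from the table above
# (parameter name -> (minimum, maximum)).
RANGES = {
    "sink.batch.size.minimum": (0, 1024),
    "sink.batch.size.maximum": (0, 1024),
    "source.connection.idle.second": (0, 86400),
    "sink.connection.idle.second": (0, 86400),
    "trans.size.maximum": (0, 1024),
    "dts.datamove.record.spouter.writers": (0, 64),
}

def build_parameters(entries):
    """Validate (module, name, value) tuples and serialize them into
    the JSON string expected by the Parameters parameter.
    Hypothetical helper for illustration only."""
    params = []
    for module, name, value in entries:
        if name in RANGES:
            low, high = RANGES[name]
            if not low <= int(value) <= high:
                raise ValueError(
                    f"{name} must be an integer from {low} to {high}"
                )
        params.append({"module": module, "name": name, "value": value})
    return json.dumps(params)

print(build_parameters([
    ("07", "sink.connection.idle.second", 60),
    ("07", "sink.batch.size.maximum", 64),
]))
```

Centralizing the range checks this way surfaces invalid values locally instead of waiting for the API call to fail.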
Example
The following example shows the Parameters parameter in JSON format:
```json
[
  {
    "module": "07",
    "name": "sink.connection.idle.second",
    "value": 60
  },
  {
    "module": "07",
    "name": "sink.batch.size.maximum",
    "value": 64
  }
]
```
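Because Parameters is a JSON-formatted string rather than a native array, the array above is typically serialized before the API call. A minimal sketch using Python's standard json module (the compact separators are an optional choice):

```python
import json

# The example array above, as Python objects.
parameters = [
    {"module": "07", "name": "sink.connection.idle.second", "value": 60},
    {"module": "07", "name": "sink.batch.size.maximum", "value": 64},
]

# Serialize into the string that the Parameters parameter expects.
parameters_str = json.dumps(parameters, separators=(",", ":"))
print(parameters_str)
```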