
DataWorks:GetDIJob

Last Updated:Jan 12, 2026

Queries the information about a synchronization task.

Operation description

This API operation is available for all DataWorks editions.

Debugging

You can call this operation directly in OpenAPI Explorer, which saves you the trouble of calculating signatures. After a call succeeds, OpenAPI Explorer can automatically generate SDK code samples.

Authorization information

No authorization information is currently disclosed for this API operation.

Request parameters

Id (long, optional)

The ID of the synchronization task.

Example: 11588

WithDetails (boolean, optional)

Specifies whether to return detailed configuration information, including TransformationRules, TableMappings, and JobSettings. Valid values: true and false. Default value: true.

Example: true

ProjectId (long, optional)

The DataWorks workspace ID. You can log on to the DataWorks console and go to the Workspace page to query the ID.

You must configure this parameter to specify the DataWorks workspace to which the API operation is applied.

Example: 10000

DIJobId (string, optional, deprecated)

This parameter is deprecated. Use the Id parameter instead.

Example: 11588
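As an illustration of how these parameters compose, the sketch below assembles a GetDIJob query-parameter map as the table describes, omitting unset optional parameters. The helper name is hypothetical and not part of any SDK; it only mirrors the parameter names and the boolean serialization shown in the examples above.

```python
def build_get_di_job_params(id=None, with_details=None, project_id=None):
    """Assemble GetDIJob query parameters, omitting unset ones.

    Hypothetical helper: parameter names follow the table above;
    signing and transport are out of scope for this sketch.
    """
    params = {
        "Id": id,
        "WithDetails": with_details,
        "ProjectId": project_id,
    }
    # Drop parameters the caller did not set; booleans are serialized
    # as lowercase strings, matching the documented example values.
    return {
        k: (str(v).lower() if isinstance(v, bool) else str(v))
        for k, v in params.items()
        if v is not None
    }

print(build_get_di_job_params(id=11588, with_details=True, project_id=10000))
```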

Response parameters

(object)

The response body.

RequestId (string)

The request ID. You can use the ID to query logs and troubleshoot issues.

Example: C99E2BE6-9DEA-5C2E-8F51-1DDCFEADE490

PagingInfo (object)

The pagination information.

Id (long)

The ID of the synchronization task.

Example: 32601

Description (string)

The description of the synchronization task.

Example: description
DestinationDataSourceSettings (array<object>)

The properties of the destination.

DestinationDataSourceSettings (object)

The properties of the destination.

DataSourceName (string)

The name of the data source.

Example: dw_mysql

DestinationDataSourceType (string)

The destination type. Valid values: Hologres, OSS-HDFS, OSS, MaxCompute, LogHub, StarRocks, DataHub, AnalyticDB_For_MySQL, Kafka, Hive.

Example: Hologres

JobName (string)

The name of the synchronization task.

Example: imp_ods_dms_det_dealer_info_df
JobSettings (object)

The runtime settings.

ChannelSettings (string)

The channel control settings for the synchronization task. You can configure special channel control settings for the following synchronization links: data synchronization between Hologres data sources and data synchronization from Hologres to Kafka.

  1. Holo2Kafka
  • Example: {"destinationChannelSettings":{"kafkaClientProperties":[{"key":"linger.ms","value":"100"}],"keyColumns":["col3"],"writeMode":"canal"}}
  • kafkaClientProperties: the parameters related to a Kafka producer, which are used when you write data to a Kafka data source.
  • keyColumns: the names of Kafka columns to which data is written.
  • writeMode: the writing format. Valid values: json and canal.
  2. Holo2Holo
  • Example: {"destinationChannelSettings":{"conflictMode":"replace","dynamicColumnAction":"replay","writeMode":"replay"}}
  • conflictMode: the policy used to handle a conflict that occurs during data writing to Hologres. Valid values: replace and ignore.
  • writeMode: the mode in which data is written to Hologres. Valid values: replay and insert.
  • dynamicColumnAction: the mode in which data is written to dynamic columns in a Hologres table. Valid values: replay, insert, and ignore.

Example: {"structInfo":"MANAGED","storageType":"TEXTFILE","writeMode":"APPEND","partitionColumns":[{"columnName":"pt","columnType":"STRING","comment":""}],"fieldDelimiter":""}
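Because ChannelSettings is returned as a JSON string rather than a nested object, a client must parse it before reading individual keys. A minimal sketch using the Holo2Kafka example above:

```python
import json

# The Holo2Kafka ChannelSettings example from this section, exactly as the
# API would return it: a JSON object serialized into a string field.
channel_settings = (
    '{"destinationChannelSettings":{"kafkaClientProperties":'
    '[{"key":"linger.ms","value":"100"}],'
    '"keyColumns":["col3"],"writeMode":"canal"}}'
)

# Parse the string once, then navigate it like any dict.
settings = json.loads(channel_settings)
dest = settings["destinationChannelSettings"]
print(dest["writeMode"])   # canal
print(dest["keyColumns"])  # ['col3']
```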
ColumnDataTypeSettings (array<object>)

The data type mappings between source fields and destination fields.

ColumnDataTypeSettings (object)

The data type mapping between a source field and a destination field.

DestinationDataType (string)

The data type of the destination field. Valid values: bigint, boolean, string, text, datetime, timestamp, decimal, and binary. Different types of data sources support different data types.

Example: text

SourceDataType (string)

The data type of the source field. Valid values: bigint, boolean, string, text, datetime, timestamp, decimal, and binary. Different types of data sources support different data types.

Example: bigint
CycleScheduleSettings (object)

The settings for periodic scheduling.

CycleMigrationType (string)

The synchronization type that requires periodic scheduling. Valid values:

  • Full: full synchronization
  • OfflineIncremental: batch incremental synchronization

Example: Full

ScheduleParameters (string)

The scheduling parameters.

Example: bizdate=$bizdate
DdlHandlingSettings (array<object>)

The DDL operation types. Valid values:

  • RenameColumn
  • ModifyColumn
  • CreateTable
  • TruncateTable
  • DropTable
  • DropColumn
  • AddColumn

DdlHandlingSettings (object)

The type of the DDL operation. Valid values:

  • RenameColumn
  • ModifyColumn
  • CreateTable
  • TruncateTable
  • DropTable
  • DropColumn
  • AddColumn

Action (string)

The processing policy for a specific type of DDL message. Valid values:

  • Ignore: ignores a DDL message.
  • Critical: reports an error for a DDL message.
  • Normal: normally processes a DDL message.

Example: Ignore

Type (string)

The DDL operation type. Valid values:

  • RenameColumn
  • ModifyColumn
  • CreateTable
  • TruncateTable
  • DropTable

Example: CreateTable
RuntimeSettings (array<object>)

The runtime settings.

RuntimeSettings (object)

The runtime setting.

Name (string)

The name of the configuration item. Valid values:

  • src.offline.datasource.max.connection: indicates the maximum number of connections that are allowed for reading data from the source of a batch synchronization task.
  • dst.offline.truncate: indicates whether to clear the destination table before data writing.
  • runtime.offline.speed.limit.enable: indicates whether throttling is enabled for a batch synchronization task.
  • runtime.offline.concurrent: indicates the maximum number of parallel threads that are allowed for a batch synchronization task.
  • runtime.enable.auto.create.schema: indicates whether schemas are automatically created in the destination of a synchronization task.
  • runtime.realtime.concurrent: indicates the maximum number of parallel threads that are allowed for a real-time synchronization task.
  • runtime.realtime.failover.minute.dataxcdc: indicates the maximum waiting duration before a synchronization task retries the next restart if the previous restart fails after failover occurs. Unit: minutes.
  • runtime.realtime.failover.times.dataxcdc: indicates the maximum number of failures that are allowed for restarting a synchronization task after failovers occur.

Example: runtime.offline.concurrent

Value (string)

The value of the configuration item.

Example: 1
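Since RuntimeSettings is returned as a list of {Name, Value} objects rather than a map, folding it into a dict makes individual configuration items easy to look up. A small sketch, using sample values consistent with the list above:

```python
# RuntimeSettings as it might appear in a parsed response; the second
# entry's value is an illustrative assumption, not taken from the doc.
runtime_settings = [
    {"Name": "runtime.offline.concurrent", "Value": "1"},
    {"Name": "runtime.offline.speed.limit.enable", "Value": "false"},
]

# Fold the name/value pairs into a plain dict for O(1) lookup.
settings = {item["Name"]: item["Value"] for item in runtime_settings}
print(settings["runtime.offline.concurrent"])  # 1
```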
MigrationType (string)

The synchronization type. Valid values:

  • FullAndRealtimeIncremental: full synchronization and real-time incremental synchronization of data in an entire database
  • RealtimeIncremental: real-time incremental synchronization of data in a single table
  • Full: full batch synchronization of data in an entire database
  • OfflineIncremental: batch incremental synchronization of data in an entire database
  • FullAndOfflineIncremental: full synchronization and batch incremental synchronization of data in an entire database

Example: FullAndRealtimeIncremental
JobType (string)

The task type. Valid values:

  • DatabaseRealtimeMigration (real-time database migration): streams multiple tables from multiple source databases. Full synchronization only, incremental synchronization only, and full plus incremental synchronization are supported.

  • DatabaseOfflineMigration (batch database migration): batch-synchronizes multiple tables from multiple source databases. Full synchronization only, incremental synchronization only, and full plus incremental synchronization are supported.

  • SingleTableRealtimeMigration (real-time single-table migration): streams a single source table.

Example: DatabaseRealtimeMigration
ProjectId (long)

The DataWorks workspace ID. You can log on to the DataWorks console and go to the Workspace page to query the ID.

This parameter indicates the DataWorks workspace to which the API operation is applied.

Example: 98330
ResourceSettings (object)

The resource settings.

OfflineResourceSettings (object)

The resource used for batch synchronization.

RequestedCu (double)

The number of compute units (CUs) in the resource group for scheduling that are used for batch synchronization.

Example: 2.0

ResourceGroupIdentifier (string)

The identifier of the resource group for Data Integration used for batch synchronization.

Example: S_res_group_7708_1667792816832

RealtimeResourceSettings (object)

The resource used for real-time synchronization.

RequestedCu (double)

The number of CUs in the resource group for Data Integration that are used for real-time synchronization.

Example: 2.0

ResourceGroupIdentifier (string)

The identifier of the resource group for Data Integration used for real-time synchronization.

Example: S_res_group_235454102432001_1579085295030

ScheduleResourceSettings (object)

The resource used for scheduling.

RequestedCu (double)

The number of CUs in the resource group for Data Integration that are used for scheduling.

Example: 2.0

ResourceGroupIdentifier (string)

The identifier of the resource group for scheduling used by the synchronization task.

Example: S_res_group_235454102432001_1718359176885
SourceDataSourceSettings (array<object>)

The settings of the source. Only a single source is supported.

SourceDataSourceSettings (object)

The settings of the source. Only a single source is supported.

DataSourceName (string)

The name of the data source.

Example: dw_mysql

DataSourceProperties (object)

The properties of the data source.

Encoding (string)

The encoding format of the database.

Example: UTF-8

Timezone (string)

The time zone.

Example: GMT+8

SourceDataSourceType (string)

The source type. Valid values: PolarDB, MySQL, Kafka, LogHub, Hologres, Oracle, OceanBase, MongoDB, RedShift, Hive, SQLServer, Doris, ClickHouse.

Example: Mysql
TableMappings (array<object>)

The list of mappings between rules used to select synchronization objects in the source and transformation rules applied to the selected synchronization objects. Each entry in the list displays a mapping between a rule used to select synchronization objects and a transformation rule applied to the selected synchronization objects.

Example: [ { "SourceObjectSelectionRules":[ { "ObjectType":"Database", "Action":"Include", "ExpressionType":"Exact", "Expression":"biz_db" }, { "ObjectType":"Schema", "Action":"Include", "ExpressionType":"Exact", "Expression":"s1" }, { "ObjectType":"Table", "Action":"Include", "ExpressionType":"Exact", "Expression":"table1" } ], "TransformationRuleNames":[ { "RuleName":"my_database_rename_rule", "RuleActionType":"Rename", "RuleTargetType":"Schema" } ] } ]
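The TableMappings example above can be walked programmatically once parsed; the sketch below collects which source object each selection rule matches, using the exact JSON from the example:

```python
import json

# The TableMappings example from the note above, verbatim.
table_mappings = json.loads('''[{
  "SourceObjectSelectionRules":[
    {"ObjectType":"Database","Action":"Include","ExpressionType":"Exact","Expression":"biz_db"},
    {"ObjectType":"Schema","Action":"Include","ExpressionType":"Exact","Expression":"s1"},
    {"ObjectType":"Table","Action":"Include","ExpressionType":"Exact","Expression":"table1"}
  ],
  "TransformationRuleNames":[
    {"RuleName":"my_database_rename_rule","RuleActionType":"Rename","RuleTargetType":"Schema"}
  ]
}]''')

# Map each selected object type to the expression it was matched by.
selected = {r["ObjectType"]: r["Expression"]
            for r in table_mappings[0]["SourceObjectSelectionRules"]}
print(selected)  # {'Database': 'biz_db', 'Schema': 's1', 'Table': 'table1'}
```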
TableMappings (object)

Each rule defines a table that needs to be synchronized.

SourceObjectSelectionRules (array<object>)

The list of rules used to select synchronization objects in the source.

SourceObjectSelectionRules (object)

The rule used to select synchronization objects in the source. The objects can be databases or tables.

Action (string)

The operation that is performed to select objects. Valid values: Include and Exclude.

Example: Include

Expression (string)

The expression.

Example: mysql_table_1

ExpressionType (string)

The expression type. Valid values: Exact and Regex.

Example: Exact

ObjectType (string)

The object type. Valid values:

  • Table
  • Schema
  • Database

Example: Table
TransformationRules (array<object>)

The list of transformation rules that are applied to the synchronization objects selected from the source. Each entry in the list defines a transformation rule.

TransformationRuleNames (object)

The transformation rule that is applied to the synchronization objects selected from the source.

RuleName (string)

The name of the rule. If the values of the RuleActionType parameter and the RuleTargetType parameter are the same for multiple transformation rules, you must make sure that the transformation rule names are unique.

Example: rename_rule_1

RuleActionType (string)

The action type. Valid values:

  • DefinePrimaryKey
  • Rename
  • AddColumn
  • HandleDml

Example: AddColumn

RuleTargetType (string)

The type of the object on which the action is performed. Valid values:

  • Table
  • Schema
  • Database

Example: Table
TransformationRules (array<object>)

The list of transformation rules that are applied to the synchronization objects selected from the source.

Example: [ { "RuleName":"my_database_rename_rule", "RuleActionType":"Rename", "RuleTargetType":"Schema", "RuleExpression":"{\"expression\":\"${srcDatasourceName}_${srcDatabaseName}\"}" } ]

TransformationRules (object)

The transformation rule that is applied to the synchronization objects selected from the source.

RuleActionType (string)

The action type. Valid values:

  • DefinePrimaryKey
  • Rename
  • AddColumn
  • HandleDml
  • DefineIncrementalCondition
  • DefineCycleScheduleSettings
  • DefinePartitionKey

Example: Rename

RuleExpression (string)

The expression of the rule. The expression is a JSON string.

  1. Example of a renaming rule
  • Example: {"expression":"${srcDatasourceName}_${srcDatabaseName}_0922"}
  • expression: the expression of the renaming rule. You can use the following variables in an expression: ${srcDatasourceName}, ${srcDatabaseName}, and ${srcTableName}. ${srcDatasourceName} indicates the name of the source. ${srcDatabaseName} indicates the name of a source database. ${srcTableName} indicates the name of a source table.
  2. Example of a column addition rule
  • Example: {"columns":[{"columnName":"my_add_column","columnValueType":"Constant","columnValue":"123"}]}
  • If no rule of this type is configured, no fields are added to the destination and no values are assigned by default.
  • columnName: the name of the field that is added.
  • columnValueType: the value type of the field. Valid values: Constant and Variable.
  • columnValue: the value of the field. If the columnValueType parameter is set to Constant, the value of the columnValue parameter is a constant of the STRING data type. If the columnValueType parameter is set to Variable, the value of the columnValue parameter is a built-in variable. The following built-in variables are supported: EXECUTE_TIME (LONG data type), DB_NAME_SRC (STRING data type), DATASOURCE_NAME_SRC (STRING data type), TABLE_NAME_SRC (STRING data type), DB_NAME_DEST (STRING data type), DATASOURCE_NAME_DEST (STRING data type), TABLE_NAME_DEST (STRING data type), and DB_NAME_SRC_TRANSED (STRING data type). EXECUTE_TIME indicates the execution time. DB_NAME_SRC indicates the name of a source database. DATASOURCE_NAME_SRC indicates the name of the source. TABLE_NAME_SRC indicates the name of a source table. DB_NAME_DEST indicates the name of a destination database. DATASOURCE_NAME_DEST indicates the name of the destination. TABLE_NAME_DEST indicates the name of a destination table. DB_NAME_SRC_TRANSED indicates the database name obtained after a transformation.
  3. Example of a rule used to specify primary key fields for a destination table
  • Example: {"columns":["ukcolumn1","ukcolumn2"]}
  • If no rule of this type is configured, the primary key fields in the mapped source table are used for the destination table by default.
  • If the destination table is an existing table, Data Integration does not modify the schema of the destination table. If the specified primary key fields do not exist in the destination table, an error is reported when the synchronization task starts to run.
  • If the destination table is automatically created by the system, Data Integration automatically creates the schema of the destination table. The schema contains the primary key fields that you specify. If the specified primary key fields do not exist in the destination table, an error is reported when the synchronization task starts to run.
  4. Example of a rule used to process DML messages
  • Example: {"dmlPolicies":[{"dmlType":"Delete","dmlAction":"Filter","filterCondition":"id > 1"}]}
  • If no rule of this type is configured, the default processing policy for messages generated for insert, update, and delete operations is Normal.
  • dmlType: the DML operation. Valid values: Insert, Update, and Delete.
  • dmlAction: the processing policy for DML messages. Valid values: Normal, Ignore, Filter, and LogicalDelete. Filter indicates conditional processing. The value Filter is returned for the dmlAction parameter only when the value of the dmlType parameter is Update or Delete.
  • filterCondition: the condition used to filter DML messages. This parameter is returned only when the value of the dmlAction parameter is Filter.
  5. Example of a rule used to perform incremental synchronization
  • Example: {"where":"id > 0"}
  • The rule used to perform incremental synchronization is returned.
  6. Example of a rule used to configure scheduling parameters for an auto triggered task
  • Example: {"cronExpress":" * * * * * *", "cycleType":"1"}
  • The rule used to configure scheduling parameters for an auto triggered task is returned.
  7. Example of a rule used to specify a partition key
  • Example: {"columns":["id"]}
  • The rule used to specify a partition key is returned.

Example: {"expression":"${srcDatasourceName}_${srcDatabaseName}"}
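Since each RuleExpression payload is a JSON string whose shape depends on RuleActionType, a client typically parses the string and then branches on the action type. A sketch using three of the example payloads above:

```python
import json

# Example RuleExpression strings from this section, keyed by the
# RuleActionType each one belongs to.
examples = {
    "Rename": '{"expression":"${srcDatasourceName}_${srcDatabaseName}_0922"}',
    "AddColumn": '{"columns":[{"columnName":"my_add_column",'
                 '"columnValueType":"Constant","columnValue":"123"}]}',
    "HandleDml": '{"dmlPolicies":[{"dmlType":"Delete","dmlAction":"Filter",'
                 '"filterCondition":"id > 1"}]}',
}

# Parse every expression string into a dict so its keys can be inspected.
parsed = {action: json.loads(expr) for action, expr in examples.items()}
print(parsed["AddColumn"]["columns"][0]["columnName"])  # my_add_column
```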
RuleName (string)

The name of the rule. If the values of the RuleActionType parameter and the RuleTargetType parameter are the same for multiple transformation rules, you must make sure that the transformation rule names are unique.

Example: rename_rule_1

RuleTargetType (string)

The type of the object on which the action is performed. Valid values:

  • Table
  • Schema
  • Database

Example: Table
JobStatus (string)

The status of the job.

Example: Running

DIJobId (string, deprecated)

This parameter is deprecated. Use the Id parameter instead.

Example: 32601

Examples

Sample success responses

JSON format

{
  "RequestId": "C99E2BE6-9DEA-5C2E-8F51-1DDCFEADE490",
  "PagingInfo": {
    "Id": 32601,
    "Description": "description",
    "DestinationDataSourceSettings": [
      {
        "DataSourceName": "dw_mysql"
      }
    ],
    "DestinationDataSourceType": "Hologres",
    "JobName": "imp_ods_dms_det_dealer_info_df",
    "JobSettings": {
      "ChannelSettings": {
        "structInfo": "MANAGED",
        "storageType": "TEXTFILE",
        "writeMode": "APPEND",
        "partitionColumns": [
          {
            "columnName": "pt",
            "columnType": "STRING",
            "comment": ""
          }
        ],
        "fieldDelimiter": ""
      },
      "ColumnDataTypeSettings": [
        {
          "DestinationDataType": "text",
          "SourceDataType": "bigint"
        }
      ],
      "CycleScheduleSettings": {
        "CycleMigrationType": "Full",
        "ScheduleParameters": "bizdate=$bizdate"
      },
      "DdlHandlingSettings": [
        {
          "Action": "Ignore",
          "Type": "CreateTable"
        }
      ],
      "RuntimeSettings": [
        {
          "Name": "runtime.offline.concurrent",
          "Value": "1"
        }
      ]
    },
    "MigrationType": "FullAndRealtimeIncremental",
    "JobType": "DatabaseRealtimeMigration",
    "ProjectId": 98330,
    "ResourceSettings": {
      "OfflineResourceSettings": {
        "RequestedCu": 2,
        "ResourceGroupIdentifier": "S_res_group_7708_1667792816832"
      },
      "RealtimeResourceSettings": {
        "RequestedCu": 2,
        "ResourceGroupIdentifier": "S_res_group_235454102432001_1579085295030"
      },
      "ScheduleResourceSettings": {
        "RequestedCu": 2,
        "ResourceGroupIdentifier": "S_res_group_235454102432001_1718359176885"
      }
    },
    "SourceDataSourceSettings": [
      {
        "DataSourceName": "dw_mysql",
        "DataSourceProperties": {
          "Encoding": "UTF-8",
          "Timezone": "GMT+8"
        }
      }
    ],
    "SourceDataSourceType": "Mysql",
    "TableMappings": [
      {
        "SourceObjectSelectionRules": [
          {
            "Action": "Include",
            "Expression": "mysql_table_1",
            "ExpressionType": "Exact",
            "ObjectType": "Table"
          }
        ],
        "TransformationRules": [
          {
            "RuleName": "rename_rule_1",
            "RuleActionType": "AddColumn",
            "RuleTargetType": "Table"
          }
        ]
      }
    ],
    "TransformationRules": [
      {
        "RuleActionType": "Rename",
        "RuleExpression": "{\"expression\":\"${srcDatasourceName}_${srcDatabaseName}\"}",
        "RuleName": "rename_rule_1",
        "RuleTargetType": "Table"
      }
    ],
    "JobStatus": "Running",
    "DIJobId": "32601"
  }
}
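A client receiving this response navigates it as ordinary JSON. The sketch below parses a trimmed version of the sample above and pulls out the fields most often checked after a GetDIJob call:

```python
import json

# A trimmed version of the sample success response above.
response = json.loads('''{
  "RequestId": "C99E2BE6-9DEA-5C2E-8F51-1DDCFEADE490",
  "PagingInfo": {
    "Id": 32601,
    "JobName": "imp_ods_dms_det_dealer_info_df",
    "MigrationType": "FullAndRealtimeIncremental",
    "JobStatus": "Running",
    "ResourceSettings": {
      "RealtimeResourceSettings": {
        "RequestedCu": 2,
        "ResourceGroupIdentifier": "S_res_group_235454102432001_1579085295030"
      }
    }
  }
}''')

# Despite its name, PagingInfo holds the task itself, not a page of results.
job = response["PagingInfo"]
print(job["JobName"], job["JobStatus"])  # imp_ods_dms_det_dealer_info_df Running
```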

Error codes

For a list of error codes, see Service error codes.

Change history

2025-03-27: The response structure of the API has changed.
2025-01-06: The request parameters and the response structure of the API have changed.
2024-10-15: The internal configuration of the API was changed, but calls are not affected.
2024-10-15: The response structure of the API has changed.
2024-10-14: The request parameters of the API have changed.