This topic describes the data types and parameters that DM Writer supports and how to configure it by using the code editor.

Notice DM Writer supports only exclusive resource groups for Data Integration. It does not support the default resource group or custom resource groups. For more information, see Create and use an exclusive resource group for Data Integration and Add a custom resource group.

Background information

DM Writer allows you to write data to tables stored in DM databases. Specifically, DM Writer obtains data from a Data Integration reader, connects to a remote DM database by using Java Database Connectivity (JDBC), and then executes an INSERT INTO statement to write the data to the DM database. DM Writer is a generic writer for relational databases. To enable DM Writer to write data to a relational database, you must register the driver of that database.
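To make this write path concrete, the following minimal Java sketch shows the same flow in plain JDBC: register the DM driver, open a connection, and execute an INSERT INTO statement. It is an illustration only, not DM Writer's actual implementation; the URL, credentials, table name (dest_table), and columns are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class DmInsertSketch {
    public static void main(String[] args) throws Exception {
        // Register the DM JDBC driver. DM Writer also requires the driver of the
        // destination database to be registered before it can connect.
        Class.forName("dm.jdbc.driver.DmDriver");

        // Placeholder connection settings: replace with your jdbcUrl, username, and password.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:dm://ip:port/database", "username", "password");
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO dest_table (id, name) VALUES (?, ?)")) {
            // Each record obtained from the reader becomes one set of bind values.
            ps.setLong(1, 1L);
            ps.setString(2, "example");
            ps.executeUpdate();
        }
    }
}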

DM Writer is designed for extract, transform, load (ETL) developers to import data from data warehouses to DM databases. Users such as database administrators can use DM Writer as a data migration tool.

DM Writer supports most data types in relational databases, such as numeric and character types. Make sure that your data types are supported.

Parameters

The following parameters are supported:
• jdbcUrl
  Description: The JDBC URL of the database. The format must follow the official specifications. You can also include additional connection control information in the URL. The format varies based on the database type, and Data Integration selects an appropriate driver for writing data based on the format.
    • Format for DM databases: jdbc:dm://ip:port/database
    • Format for Db2 databases: jdbc:db2://ip:port/database
    • Format for PPAS databases: jdbc:edb://ip:port/database
  Required: Yes
  Default value: N/A
• username
  Description: The username that is used to connect to the database.
  Required: Yes
  Default value: N/A
• password
  Description: The password that is used to connect to the database.
  Required: Yes
  Default value: N/A
• table
  Description: The name of the destination table.
  Required: Yes
  Default value: N/A
• column
  Description: The columns in the destination table to which you want to write data. Separate the columns with commas (,).
  Note: We recommend that you do not use the default setting.
  Required: Yes
  Default value: N/A
• preSql
  Description: The SQL statement to execute before the sync node runs. For example, you can use this parameter to delete outdated data before data synchronization. You can execute only one SQL statement.
  Note: If you specify multiple SQL statements in the code editor, the system may not execute them in the same transaction.
  Required: No
  Default value: N/A
• postSql
  Description: The SQL statement to execute after the sync node runs. For example, you can use this parameter to add a timestamp after data synchronization. You can execute only one SQL statement.
  Note: If you specify multiple SQL statements in the code editor, the system may not execute them in the same transaction.
  Required: No
  Default value: N/A
• batchSize
  Description: The number of data records to write at a time. Set this parameter to an appropriate value based on your business requirements. A proper value can greatly reduce the number of network interactions between Data Integration and the DM database and increase write throughput. However, an excessively large value may cause an out of memory (OOM) error during data synchronization. For an illustration of how batching works, see the sketch after this list.
  Required: No
  Default value: 1024
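The batching idea behind batchSize can be pictured as buffering records and flushing them with a single JDBC batch execution every batchSize records. The following Java sketch shows that idea under assumed placeholder names (dest_table, id, name); it is a simplified illustration, not DM Writer's internal code.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BatchWriteSketch {
    // Writes records in batches of batchSize. Larger batches mean fewer network
    // round trips, but each batch is buffered in memory, so an excessively large
    // value risks an OOM error.
    static void writeInBatches(Connection conn, List<Object[]> records, int batchSize)
            throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO dest_table (id, name) VALUES (?, ?)")) {
            int buffered = 0;
            for (Object[] record : records) {
                ps.setObject(1, record[0]);
                ps.setObject(2, record[1]);
                ps.addBatch();
                if (++buffered >= batchSize) {
                    ps.executeBatch(); // one network interaction per batch
                    buffered = 0;
                }
            }
            if (buffered > 0) {
                ps.executeBatch(); // flush the final partial batch
            }
        }
    }
}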

Configure DM Writer by using the code editor

The following example shows how to configure a sync node to write data to a DM database. For more information, see Create a sync node by using the code editor.
{
    "type": "job",
    "steps": [
        {
            "stepType": "oracle",
            "parameter": {
                "datasource": "aaa",
                "column": [
                    "PROD_ID",
                  "name"
                ],
                "where": "",
                "splitPk": "",
                "encoding": "UTF-8",
                "table": "PENGXI.SALES"
            },
            "name": "Reader",
            "category": "reader"
        },
        {
            "stepType": "rdbms",
            "parameter": {
                "connection": [
                    {
                        "jdbcUrl": "jdbc:dm://ip:port/database",
                        "table": [
                            "table"
                        ]
                    }
                ],
                "username": "username",
                "password": "password",
                "table": "table",
                "column": [
                   "id",
                  "name"
                ],
                "preSql": [
                    "delete from XXX;"
                ]
            },
            "name": "Writer",
            "category": "writer"
        }
    ],
    "version": "2.0",
    "order": {
        "hops": [
            {
                "from": "Reader",
                "to": "Writer"
            }
        ]
    },
    "setting": {
        "errorLimit": {
            "record": ""
        },
        "speed": {
            "concurrent": 2,
            "throttle": false
        }
    }
}
                
You can enable DM Writer to support a new database by using the following method:
  1. Go to the directory of DM Writer, namely, ${DATAX_HOME}/plugin/writer/rdbmswriter. In the path, ${DATAX_HOME} indicates the home directory of Data Integration.
  2. Add the driver of your database to the drivers array in the plugin.json file in the DM Writer directory. DM Writer automatically selects an appropriate driver when it connects to a database, as illustrated by the sketch after this procedure.
    {
        "name": "RDBMS Writer",
        "class": "com.alibaba.datax.plugin.reader.RDBMS Writer.RDBMS Writer",
        "description": "useScene: prod. mechanism: Jdbc connection using the database, execute select sql, retrieve data from the ResultSet. warn: The more you know about the database, the less problems you encounter.",
        "developer": "alibaba",
        "drivers": [
            "dm.jdbc.driver.DmDriver",
            "com.ibm.db2.jcc.DB2Driver",
            "com.sybase.jdbc3.jdbc.SybDriver",
            "com.edb.Driver"
        ]
    }
  3. Add the JAR package of the driver to the libs directory under the DM Writer directory.
    $tree
    .
    |-- libs
    |   |-- Dm7JdbcDriver16.jar
    |   |-- commons-collections-3.0.jar
    |   |-- commons-io-2.4.jar
    |   |-- commons-lang3-3.3.2.jar
    |   |-- commons-math3-3.1.1.jar
    |   |-- datax-common-0.0.1-SNAPSHOT.jar
    |   |-- datax-service-face-1.0.23-20160120.024328-1.jar
    |   |-- db2jcc4.jar
    |   |-- druid-1.0.15.jar
    |   |-- edb-jdbc16.jar
    |   |-- fastjson-1.1.46.sec01.jar
    |   |-- guava-r05.jar
    |   |-- hamcrest-core-1.3.jar
    |   |-- jconn3-1.0.0-SNAPSHOT.jar
    |   |-- logback-classic-1.0.13.jar
    |   |-- logback-core-1.0.13.jar
    |   |-- plugin-rdbms-util-0.0.1-SNAPSHOT.jar
    |   `-- slf4j-api-1.7.10.jar
    |-- plugin.json
    |-- plugin_job_template.json
    `-- rdbmswriter-0.0.1-SNAPSHOT.jar
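The driver selection described in step 2 can be pictured as trying each class name in the drivers array until one loads and accepts the configured JDBC URL. The following Java sketch illustrates that idea; it is an assumption made for illustration purposes, not the plugin's source code.

import java.sql.Driver;
import java.util.Arrays;
import java.util.List;

public class DriverSelectionSketch {
    // Candidate driver classes, mirroring the drivers array in plugin.json.
    static final List<String> DRIVERS = Arrays.asList(
            "dm.jdbc.driver.DmDriver",
            "com.ibm.db2.jcc.DB2Driver",
            "com.sybase.jdbc3.jdbc.SybDriver",
            "com.edb.Driver");

    // Returns the first listed driver that loads from the libs directory and
    // accepts the given JDBC URL, for example jdbc:dm://ip:port/database.
    static Driver selectDriver(String jdbcUrl) throws Exception {
        for (String className : DRIVERS) {
            try {
                Driver driver = (Driver) Class.forName(className)
                        .getDeclaredConstructor().newInstance();
                if (driver.acceptsURL(jdbcUrl)) {
                    return driver;
                }
            } catch (ReflectiveOperationException e) {
                // The driver JAR is not in the libs directory; try the next candidate.
            }
        }
        throw new IllegalStateException("No configured driver accepts " + jdbcUrl);
    }
}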