AnalyticDB:Use Data Integration to migrate and batch synchronize data

Last Updated: Mar 28, 2026

Data Integration is a reliable, secure, cost-effective, elastic, and scalable data synchronization platform provided by Alibaba Cloud. It supports offline (full and incremental) data access channels across diverse network environments and more than 20 types of data sources. For the full list, see Supported data source types, Reader plug-ins, and Writer plug-ins.

Use cases

  • Import: Load processed data from an external source into AnalyticDB for PostgreSQL.

  • Export: Extract data from AnalyticDB for PostgreSQL to another data store for downstream processing.

Prerequisites

Before you begin, make sure you have the following:

For Data Integration:

  • An Alibaba Cloud account — required to access DataWorks and Data Integration.

  • MaxCompute activated — activating MaxCompute automatically creates a default MaxCompute data source and lets you log on to the DataWorks console.

  • A DataWorks workspace — used to organize your workflows, data, and synchronization tasks.

To create a synchronization task using a RAM user's credentials, grant the required permissions to that RAM user first. See Create a RAM user.

For AnalyticDB for PostgreSQL:

  • (Import only) The destination database and table created on your AnalyticDB for PostgreSQL instance, using the psql CLI client.

  • (Export only) An IP address whitelist configured on your AnalyticDB for PostgreSQL instance. See Add whitelist.
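For the import prerequisite above, the destination table is created with psql before the synchronization task runs. The following is a minimal sketch; the database name, table name, columns, and distribution key are hypothetical examples — replace them with your actual schema:

```sql
-- Connect to the instance first, for example:
--   psql -h <instance-endpoint> -p 5432 -U <username> -d <database>
-- The table below is illustrative; adjust columns and the distribution
-- key to match your source data.
CREATE TABLE orders (
    order_id   bigint,
    customer   varchar(64),
    amount     numeric(12,2),
    created_at timestamp
)
DISTRIBUTED BY (order_id);  -- AnalyticDB for PostgreSQL distribution clause
```

Choosing an evenly distributed column (such as a primary key) for DISTRIBUTED BY helps spread data across compute nodes and avoids skew during bulk loads.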

Import data

Step 1: Add a data source

In the DataWorks console, add the source data store as a data source. See Configure data sources for instructions.

Step 2: Configure a synchronization task

A synchronization task moves data from the source to AnalyticDB for PostgreSQL. DataWorks supports two configuration modes:

Mode         | Best for
------------ | --------------------------------------------------------------------------------------------
Wizard mode  | First-time setup; guided, form-based configuration with no scripting required
Script mode  | Custom or advanced configurations; full control over Reader and Writer plug-in parameters

Wizard mode (6 steps):

  1. Create a data synchronization node.

  2. Specify the data source.

  3. Set AnalyticDB for PostgreSQL as the data destination.

  4. Configure field mappings between the source and destination tables.

  5. Set the maximum transmission rate and dirty data check rules.

  6. Configure scheduling attributes.

For detailed instructions, see Configure a synchronization task in wizard mode.

Script mode (7 steps):

  1. Create a data synchronization node.

  2. Import a template.

  3. Configure the Reader plug-in for the source.

  4. Configure the Writer plug-in, targeting your AnalyticDB for PostgreSQL instance.

  5. Configure field mappings between the source and destination tables.

  6. Set the maximum transmission rate and dirty data check rules.

  7. Configure scheduling attributes.

For detailed instructions, see Configure a synchronization task in script mode.
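The script-mode steps above produce a JSON job definition that pairs a Reader with a Writer. The sketch below is illustrative only: the stepType values, parameter names, data source names, and table/column names are assumptions, not the exact schema — consult the Reader and Writer plug-in references for the parameters your data sources actually accept.

```json
{
  "type": "job",
  "steps": [
    {
      "category": "reader",
      "name": "Reader",
      "stepType": "mysql",
      "parameter": {
        "datasource": "my_source_ds",
        "table": ["orders"],
        "column": ["order_id", "customer", "amount", "created_at"]
      }
    },
    {
      "category": "writer",
      "name": "Writer",
      "stepType": "analyticdb_for_postgresql",
      "parameter": {
        "datasource": "my_adbpg_ds",
        "table": "orders",
        "column": ["order_id", "customer", "amount", "created_at"],
        "writeMode": "insert"
      }
    }
  ],
  "setting": {
    "errorLimit": { "record": "0" },
    "speed": { "throttle": true, "mbps": "1" }
  }
}
```

The "setting" block corresponds to steps 6 and 7 of the procedure: "errorLimit" caps the number of dirty records tolerated, and "speed" caps the transmission rate.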

Export data

The export procedure mirrors the import procedure, with one key difference: set AnalyticDB for PostgreSQL as the data source instead of the data destination, and specify your target data store as the destination.

Start by adding an AnalyticDB for PostgreSQL connection in DataWorks. See Add an AnalyticDB for PostgreSQL connection, then follow the same wizard mode or script mode steps described in Import data.
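In script mode, the export job uses the same structure with the roles reversed: AnalyticDB for PostgreSQL appears on the reader side. A hypothetical Reader fragment (all names are assumptions; see the Reader plug-in reference for the exact parameters):

```json
{
  "category": "reader",
  "name": "Reader",
  "stepType": "analyticdb_for_postgresql",
  "parameter": {
    "datasource": "my_adbpg_ds",
    "table": "orders",
    "column": ["order_id", "amount"]
  }
}
```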

What's next

For the complete DataWorks documentation, see DataWorks documentation.