AnalyticDB: Supported data sources

Last Updated: Oct 24, 2025

AnalyticDB for MySQL supports importing data from various data sources, such as RDS MySQL, MongoDB, OSS, MaxCompute, and Kafka, into a data warehouse or a data lake. The import methods vary depending on the data source. Use this document to select an appropriate import method.

Overview

The differences between ingesting data into a data warehouse and a data lake are as follows:

  • Data ingestion into a data warehouse:

    • Data is pre-processed and then imported into the data warehouse.

    • The data warehouse uses Xuanwu, a proprietary analytical storage engine developed for AnalyticDB for MySQL. Xuanwu provides highly reliable, highly available, high-performance, and cost-effective enterprise-grade storage, and enables high-throughput real-time writes and high-performance real-time queries.

    • Ingesting data into a data warehouse is suitable for business scenarios that require high performance for data analytics.

  • Data ingestion into a data lake:

    • Raw data is imported into the data lake in open source table formats, such as Iceberg and Paimon.

    • You can use the lake storage provided by AnalyticDB for MySQL or your own OSS bucket as the data lake storage. Because the data is stored in open formats such as Iceberg and Paimon, it can be accessed both by the Spark and XIHE engines of AnalyticDB for MySQL and by external engines such as MaxCompute.

    • Ingesting data into a data lake is suitable for business scenarios that favor open source formats and engines and do not have strict analytics performance requirements. If you need faster access to data lake files, you can enable LakeCache, which provides higher bandwidth and lower latency than direct OSS access.

Data ingestion into a data warehouse

| Category | Data source | Import method | Product edition | Documentation |
| --- | --- | --- | --- | --- |
| Database | RDS MySQL | External table | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data from RDS MySQL using an external table |
| Database | RDS MySQL | DTS | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DTS |
| Database | RDS MySQL | DataWorks | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DataWorks |
| Database | RDS MySQL | Seamless integration | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Synchronize data using seamless integration (Zero-ETL) |
| Database | RDS SQL Server | DTS | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DTS |
| Database | RDS SQL Server | DataWorks | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DataWorks |
| Database | PolarDB Distributed Edition (formerly DRDS) | DTS | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DTS |
| Database | PolarDB Distributed Edition (formerly DRDS) | DataWorks | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DataWorks |
| Database | PolarDB Distributed Edition (formerly DRDS) | One-stop synchronization | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Automatically synchronize PolarDB-X metadata |
| Database | PolarDB for MySQL | Federated analytics | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Synchronize data using the federated analytics feature |
| Database | PolarDB for MySQL | DTS | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DTS |
| Database | PolarDB for MySQL | Seamless integration | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Synchronize data using seamless integration (Zero-ETL) |
| Database | MongoDB | External table | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data from MongoDB using an external table |
| Database | MongoDB | Seamless integration | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Synchronize data using seamless integration (Zero-ETL) |
| Database | Lindorm | Seamless integration | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data from Lindorm |
| Database | Oracle | DataWorks | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data from Oracle |
| Database | Self-managed MySQL | External table | Data Warehouse Edition | Import data from a self-managed MySQL database |
| Database | Self-managed HBase | DTS | Data Warehouse Edition | Import data from a self-managed HBase cluster |
| Storage | OSS | External table | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data from OSS using an external table |
| Storage | OSS | DataWorks | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DataWorks |
| Storage | Tablestore | External table | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Query and import data from Tablestore |
| Storage | HDFS | External table | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data from HDFS using an external table |
| Storage | HDFS | DataWorks | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DataWorks |
| Big data | MaxCompute | External table | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data from MaxCompute using an external table |
| Big data | MaxCompute | DataWorks | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DataWorks |
| Big data | Flink | Flink | Data Warehouse Edition | Import data from Flink |
| Message queue | Kafka | DataWorks | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DataWorks |
| Message queue | Kafka | Logstash plugin | Data Warehouse Edition | Import data to Data Warehouse Edition using Logstash |
| Log data | Log data | Data synchronization | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Synchronize SLS data to Data Warehouse Edition using the data synchronization feature |
| Log data | Log data | Logstash plugin | Data Warehouse Edition | Import data to Data Warehouse Edition using Logstash |
| Local data | Local data | SQLAlchemy | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import DataFrame data using SQLAlchemy |
| Local data | Local data | LOAD DATA | Data Warehouse Edition | Import data to Data Warehouse Edition using LOAD DATA |
| Local data | Local data | Import tool | Data Warehouse Edition | Import data to Data Warehouse Edition using the import tool |
| Local data | Local data | Kettle | Data Warehouse Edition | Import data to Data Warehouse Edition using Kettle |
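For local data, the SQLAlchemy path in the table above can be sketched in a few lines. AnalyticDB for MySQL speaks the MySQL wire protocol, so a standard `mysql+pymysql` SQLAlchemy URL works; the endpoint, credentials, database, and table names below are placeholders, not real values.

```python
import pandas as pd
from sqlalchemy import create_engine

def import_dataframe(engine_url: str, df: pd.DataFrame, table: str) -> int:
    """Write a DataFrame into a table over a SQLAlchemy engine.

    Returns the number of rows written.
    """
    engine = create_engine(engine_url)
    with engine.begin() as conn:  # commit on success, roll back on error
        # if_exists="append" keeps existing rows; use "replace" to recreate.
        df.to_sql(table, conn, if_exists="append", index=False)
    return len(df)

# Hypothetical AnalyticDB for MySQL endpoint (placeholder values):
# url = "mysql+pymysql://user:password@am-xxx.ads.aliyuncs.com:3306/adb_demo"
# import_dataframe(url, pd.DataFrame({"id": [1, 2], "name": ["a", "b"]}), "t_orders")
```

`DataFrame.to_sql` issues INSERT statements, which is convenient for small to medium volumes; for bulk loads, the LOAD DATA and import tool paths in the table are typically faster.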

Data ingestion into a data lake

Important

This feature is available only for Enterprise Edition, Basic Edition, or Data Lakehouse Edition clusters.

| Category | Data source | Import method | Documentation |
| --- | --- | --- | --- |
| Message queue | Kafka | Data synchronization | Synchronize Kafka data using the data synchronization feature (Recommended) |
| Log data | Simple Log Service (SLS) | Data synchronization | Synchronize SLS data using the data synchronization feature (Recommended) |
| Big data | Hive | Data migration | Import data from Hive |
| Storage | OSS | Metadata discovery | Import data using metadata discovery |

References

AnalyticDB for MySQL also supports the asynchronous submission of data import tasks. For more information, see Submit an asynchronous import task.
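As an illustration, an asynchronous import generally takes the following shape for an OSS source. This is a sketch, not a complete statement: the TABLE_PROPERTIES keys shown and all endpoint, bucket, and credential values are placeholders that vary by data source and cluster version, so consult the linked topics for the exact syntax.

```sql
-- Map OSS files as an external table (illustrative property values).
CREATE TABLE oss_orders_ext (
  order_id BIGINT,
  amount DECIMAL(10, 2)
) ENGINE = 'OSS'
TABLE_PROPERTIES = '{
  "endpoint": "<oss-internal-endpoint>",
  "url": "oss://<bucket>/<path>/",
  "accessid": "<access-key-id>",
  "accesskey": "<access-key-secret>",
  "format": "csv"
}';

-- Submit the import as an asynchronous job: the statement returns a job ID
-- immediately instead of blocking until the copy finishes.
SUBMIT JOB INSERT OVERWRITE INTO orders
SELECT order_id, amount FROM oss_orders_ext;
```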