
AnalyticDB: Supported data sources

Last Updated: Mar 28, 2026

AnalyticDB for MySQL can ingest data from databases, object storage, big data platforms, message queues, and local files into either a data warehouse or a data lake. The right import method depends on your data source and performance requirements.

How it works

AnalyticDB for MySQL supports two ingestion paths:

  • Data warehouse ingestion: Data is pre-processed before loading into AnalyticDB for MySQL's proprietary Xuanwu analytic storage engine. This path delivers high-throughput real-time writes and high-performance real-time queries, making it the right choice when query performance is your top priority.

  • Data lakehouse ingestion: Raw data is stored in open-source table formats (Iceberg and Paimon), either in the built-in lake storage of AnalyticDB for MySQL or in your own Object Storage Service (OSS) bucket. Because the data stays in open formats, it can be queried by both the Spark and XIHE engines of AnalyticDB for MySQL and by external engines such as MaxCompute. Use this path when your architecture requires open-source compatibility or multi-engine access. For higher read performance, enable LakeCache, which reduces latency compared with reading directly from OSS.

Data lakehouse ingestion is available only for Enterprise Edition, Basic Edition, or Data Lakehouse Edition clusters.

Data warehouse ingestion

| Category | Data source | Import method | Edition | Documentation |
| --- | --- | --- | --- | --- |
| Database | RDS MySQL | External table | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data from RDS MySQL using an external table |
| Database | RDS MySQL | DTS | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DTS |
| Database | RDS MySQL | DataWorks | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DataWorks |
| Database | RDS MySQL | Zero-ETL | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Synchronize data using zero-ETL |
| Database | RDS SQL Server | DTS | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DTS |
| Database | RDS SQL Server | DataWorks | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DataWorks |
| Database | PolarDB Distributed Edition (formerly DRDS) | DTS | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DTS |
| Database | PolarDB Distributed Edition (formerly DRDS) | DataWorks | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DataWorks |
| Database | PolarDB Distributed Edition (formerly DRDS) | One-stop synchronization | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Automatically synchronize PolarDB-X metadata |
| Database | PolarDB for MySQL | Federated analytics | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Synchronize data using the federated analytics feature |
| Database | PolarDB for MySQL | DTS | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DTS |
| Database | PolarDB for MySQL | Zero-ETL | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Synchronize data using zero-ETL |
| Database | MongoDB | External table | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data from MongoDB using an external table |
| Database | MongoDB | Zero-ETL | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Synchronize data using zero-ETL |
| Database | Lindorm | Zero-ETL | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data from Lindorm |
| Database | Oracle | DataWorks | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data from Oracle |
| Database | Self-managed MySQL | External table | Data Warehouse Edition | Import data from a self-managed MySQL database |
| Database | Self-managed HBase | DTS | Data Warehouse Edition | Import data from a self-managed HBase cluster |
| Storage | OSS | External table | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data from OSS using an external table |
| Storage | OSS | DataWorks | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DataWorks |
| Storage | Tablestore | External table | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Query and import data from Tablestore |
| Storage | HDFS | External table | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data from HDFS using an external table |
| Storage | HDFS | DataWorks | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DataWorks |
| Big data | MaxCompute | External table | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data from MaxCompute using an external table |
| Big data | MaxCompute | DataWorks | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DataWorks |
| Big data | Flink | Flink | Data Warehouse Edition | Import data from Flink |
| Message queue | Kafka | DataWorks | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import data using DataWorks |
| Message queue | Kafka | Logstash plugin | Data Warehouse Edition | Import data using Logstash |
| Log data | Log data | Data synchronization | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Synchronize log data using data synchronization |
| Log data | Log data | Logstash plugin | Data Warehouse Edition | Import data using Logstash |
| Local data | Local files | SQLAlchemy | Data Warehouse Edition, Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Import DataFrame data using SQLAlchemy |
| Local data | Local files | LOAD DATA | Data Warehouse Edition | Import data using LOAD DATA |
| Local data | Local files | Import tool | Data Warehouse Edition | Import data using the import tool |
| Local data | Local files | Kettle | Data Warehouse Edition | Import data using Kettle |
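As a sketch of the local-data path, the SQLAlchemy method in the table above loads a pandas DataFrame through a MySQL-compatible connection. The snippet below is a minimal illustration, not the official procedure: the connection URL, table name, and columns are placeholders, and an in-memory SQLite URL is used so the example runs anywhere. For a real cluster, the URL would instead point at your AnalyticDB for MySQL endpoint (for example a `mysql+pymysql://user:password@host:3306/db`-style URL, which is an assumption here).

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder engine: SQLite in memory keeps the sketch self-contained.
# Against a real cluster you would pass a MySQL-style URL instead.
engine = create_engine("sqlite:///:memory:")

# A small DataFrame standing in for your local file,
# e.g. df = pd.read_csv("local_file.csv").
df = pd.DataFrame({"id": [1, 2, 3],
                   "city": ["Hangzhou", "Beijing", "Shanghai"]})

# Write the DataFrame as a table; for large files, append in chunks
# with chunksize= and if_exists="append".
df.to_sql("demo_table", engine, if_exists="replace", index=False)

# Verify the rows landed.
count = pd.read_sql("SELECT COUNT(*) AS n FROM demo_table", engine)["n"][0]
print(int(count))  # 3
```

The same `to_sql` call works unchanged once the engine URL targets the cluster; only the dialect driver differs.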

Data lakehouse ingestion

Important

Available only for Enterprise Edition, Basic Edition, or Data Lakehouse Edition clusters.

| Category | Data source | Import method | Documentation |
| --- | --- | --- | --- |
| Message queue | Kafka | Data synchronization (Recommended) | Synchronize Kafka data using data synchronization |
| Log data | Simple Log Service (SLS) | Data synchronization (Recommended) | Synchronize SLS data using data synchronization |
| Big data | Hive | Data migration | Import data from Hive |
| Storage | OSS | Metadata discovery | Import data using metadata discovery |

References

AnalyticDB for MySQL also supports asynchronous submission of import tasks. For more information, see Submit an asynchronous import task.