MaxCompute (previously known as ODPS) is a general-purpose, fully managed, multi-tenant data processing platform for large-scale data warehousing. MaxCompute supports various data import solutions and distributed computing models, enabling users to query massive datasets effectively, reduce production costs, and ensure data security.
Alibaba Cloud offers a platform called DataWorks for users to perform data ingestion, data processing and data management in MaxCompute. It provides fully hosted workflow services and a one-stop development and management interface to help enterprises mine and explore the full value of their data. DataWorks uses MaxCompute as its core computing and storage engine, to provide massive offline data processing, analysis, and mining capabilities.
Currently, data from the following data sources can be imported to or exported from the workspace through the data integration function: RDS, MySQL, SQL Server, PostgreSQL, MaxCompute, ApsaraDB for Memcache, DRDS, OSS, Oracle, FTP, dm, HDFS, and MongoDB.
In this document, the focus is on data ingestion from a MySQL database in ApsaraDB for RDS.
In this solution architecture, the user ingests data from MySQL RDS into a MaxCompute table by using the web-based DataWorks platform.
You need the following:
- An Alibaba Cloud Account.
- A sample dataset (click here to download).
- A sample database table.
- A MySQL ApsaraDB for RDS instance.
- A MySQL client (such as MySQL Workbench).
|Product|Parameter|Value|
|---|---|---|
|MySQL RDS|Region|Asia Pacific SE 3 (Kuala Lumpur)|
|MySQL RDS|Table DDL|`create table if not exists credit_data_mysql ( id char(48), income double, expenses double, credit_cards bigint );`|
|MySQL RDS|Source CSV file|sample_credit_data.csv|
|MySQL RDS|MaxCompute IP whitelist (Malaysia region)|18.104.22.168/24, 22.214.171.124/24, 126.96.36.199/24, 188.8.131.52/24, 100.64.0.0/8, 184.108.40.206/24, 220.127.116.11/24|
|DataWorks|Data source name|rds_demo_xxx|
|MaxCompute|ODPS table DDL|`create table if not exists credit_data_rds ( id string, income double, expenses double, credit_cards bigint );`|
Create a database on your MySQL RDS instance by clicking on the Database section in the left navigation pane.
Click Create Database to create a MySQL database.
Name the MySQL database and select the authorized account for this database.
Next, to create a table in the MySQL Database, connect to your RDS instance with a MySQL client.
MySQL client references are listed as follows.
In this document, MySQL Workbench is used to connect to the MySQL RDS instance.
After connecting successfully to the MySQL RDS instance, create a table by using the DDL code as follows.
If you are new to MySQL Workbench, note that after pasting the following code into the “Query 1” window, you must double-click your database name in the left-side menu to select it (here it is called “demo_db”), and then click the small lightning-bolt icon in the top bar to run the code.
Create MySQL Table:
create table if not exists credit_data_mysql (
    id char(48),
    income double,
    expenses double,
    credit_cards bigint
);
Upload sample source data (from a local CSV file) to the table (copy-paste the code as follows).
Note that you must change the file path in this example to match the location where you saved sample_credit_data.csv. Depending on your MySQL version, you may also need to enable the local_infile option on both the client and the server before LOAD DATA LOCAL INFILE is permitted.
Upload Local CSV:
LOAD DATA LOCAL INFILE '/Users/xiaomei/Desktop/sample_credit_data.csv'
INTO TABLE credit_data_mysql FIELDS TERMINATED BY ','
ENCLOSED BY '' LINES TERMINATED BY '\n';
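If LOAD DATA LOCAL INFILE is blocked in your environment, the rows can instead be inserted through any Python DB-API driver, such as pymysql for MySQL. This is a hedged sketch, not part of the official tutorial; the connection details in the comment are placeholders.

```python
import csv

def load_csv(conn, table: str, csv_path: str, placeholder: str = "%s") -> int:
    """Insert the data rows of a CSV file into `table` over a DB-API connection.

    Skips the CSV's first (header) row. `placeholder` is the driver's
    paramstyle token: "%s" for pymysql, "?" for sqlite3. Only pass trusted,
    hard-coded table names, since identifiers cannot be bound as parameters.
    """
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        next(reader, None)  # skip the header row (e.g. "ID,income,...")
        rows = [tuple(r) for r in reader]
    if not rows:
        return 0
    slots = ", ".join([placeholder] * len(rows[0]))
    cur = conn.cursor()
    cur.executemany(f"INSERT INTO {table} VALUES ({slots})", rows)
    conn.commit()
    return len(rows)

# Hypothetical usage against RDS (placeholder credentials):
# conn = pymysql.connect(host="rm-zf....mysql.rds.aliyuncs.com",
#                        user="demo_user", password="...", database="demo_db")
# load_csv(conn, "credit_data_mysql", "sample_credit_data.csv")
```

Unlike the LOAD DATA statement above, this sketch skips the header row, so the ID 0 0 0 artifact discussed below would not appear.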
Verify that the data has been successfully loaded into the table by running a SELECT statement (see as follows).
After running the SELECT query, you should see output like the following. Note that the top row reads ID 0 0 0: this is because we did not remove the header row from our CSV file (it is also why a warning appeared in the preceding screenshot). In general, make sure to remove any notes or header rows before loading your data into RDS.
Check RDS Table:
SELECT * FROM credit_data_mysql;
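As a programmatic complement to eyeballing the SELECT output, you can count the loaded rows from a script. The sketch below works with any Python DB-API connection; for RDS you might connect with a driver such as pymysql (the connection line in the comment is a placeholder, not part of the tutorial).

```python
def row_count(conn, table: str) -> int:
    """Return the number of rows in `table` via a DB-API connection.

    Only pass trusted, hard-coded table names: SQL identifiers cannot be
    bound as query parameters, so `table` is interpolated directly.
    """
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    (count,) = cur.fetchone()
    return count

# Hypothetical usage (placeholder credentials):
# conn = pymysql.connect(host="rm-zf....mysql.rds.aliyuncs.com",
#                        user="demo_user", password="...", database="demo_db")
# print(row_count(conn, "credit_data_mysql"))
```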
In order for MaxCompute to be able to read data from this MySQL RDS instance, MaxCompute IPs need to be whitelisted.
This whitelisting is an important security measure that allows only known IPs to access the database. If you do not add the MaxCompute IPs to the RDS instance's IP whitelist, MaxCompute will not be able to connect to your database.
Find a list of the MaxCompute IPs which must be whitelisted in each region here:
In the MySQL RDS instance, click on Security on the left navigation pane, then click on Add a Whitelist Group.
Input the MaxCompute IPs to be whitelisted:
MaxCompute IP whitelist (Malaysia region): 18.104.22.168/24, 22.214.171.124/24, 126.96.36.199/24, 188.8.131.52/24, 100.64.0.0/8, 184.108.40.206/24, 220.127.116.11/24
The whitelisted group of MaxCompute IPs is displayed here:
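Each whitelist entry is a CIDR block (a /24, for example, covers 256 addresses). If you ever need to check whether a particular address is covered by a whitelist, Python's standard ipaddress module can do it. The CIDRs below are placeholders for illustration; use the real MaxCompute list for your region.

```python
import ipaddress

def in_whitelist(ip: str, cidrs: list[str]) -> bool:
    """Return True if `ip` falls inside any of the given CIDR blocks.

    strict=False tolerates entries with host bits set (e.g. 100.64.0.0/8),
    which whitelist UIs commonly accept.
    """
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(cidr, strict=False) for cidr in cidrs)

# Placeholder CIDRs, not real MaxCompute IPs:
whitelist = ["100.64.0.0/8", "203.0.113.0/24"]
print(in_whitelist("100.64.1.1", whitelist))   # True: inside 100.64.0.0/8
print(in_whitelist("192.0.2.10", whitelist))   # False: not in any block
```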
Navigate to DataWorks > Data Integration.
On the Data Integration page, click on New Source to sync data from RDS.
Select MySQL Relational Database as the data source.
Configure the MySQL RDS data source information (sample values are shown as follows).
|Parameter|Value|
|---|---|
|Data source name|rds_demo_credit|
|MySQL RDS ID|rm-zf*|
|Account UID|5903**|
|Database name|demo_db|
|Username|**|
|Password|**|
After that, click test connectivity to check whether DataWorks can connect to the MySQL RDS database.
If the connection is successful, a green box appears in the upper-right corner saying “connectivity test successfully”.
After that, click on Complete.
In DataWorks under Data Integration, click on Data Sources in the left navigation pane and the newly created data source from RDS is visible there.
Go to Sync Tasks in the left navigation pane within Data Integration and click on New Source.
Click on Wizard Mode to set up data ingestion from RDS.
The data source is the RDS data source created in the preceding steps. The table is the one created in the MySQL RDS database: credit_data_mysql.
Click on data preview to preview the data and verify that it is correct, then click Next.
Choose odps_first (odps) as the data ingestion target.
odps_first is the default data repository for MaxCompute.
Before data can be ingested into MaxCompute, a table has to be created within MaxCompute.
Click on Create New Target Table.
Enter the table creation statement (see as follows – you may need to replace the code that MaxCompute automatically fills in).
create table if not exists credit_data_rds (
    id string,
    income double,
    expenses double,
    credit_cards bigint
);
Click on Create Table and then click Next.
It is important to make sure the order of columns in the source data map correctly onto the columns of the MaxCompute table.
The recommended approach is to make sure the source data’s column order is the same as the order of the columns in the MaxCompute table.
Click on peer mapping to map source to target.
Click Next once the column mapping is correct.
In this tutorial the columns of the source data file and columns of the MaxCompute table are in the same order, so straight lines appear from source to target. In this case, it is not necessary to change the peer mapping.
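To make the idea behind peer mapping concrete: it simply pairs source and target columns by position, which is why matching column order matters. A minimal sketch, using the column lists of the tables created above:

```python
def peer_mapping(source_cols: list[str], target_cols: list[str]) -> list[tuple[str, str]]:
    """Pair source and target columns by position, as wizard peer mapping does."""
    if len(source_cols) != len(target_cols):
        raise ValueError("source and target must have the same number of columns")
    return list(zip(source_cols, target_cols))

# Columns of credit_data_mysql and credit_data_rds, in the same order:
src = ["id", "income", "expenses", "credit_cards"]
dst = ["id", "income", "expenses", "credit_cards"]
print(peer_mapping(src, dst))
```

If the source order differed (say, income and expenses swapped), positional pairing would silently map the wrong columns, which is exactly what the preview step lets you catch.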
Verify that Maximum Operating Rate and Concurrent Jobs appear as follows, then click Next.
Verify configuration and if everything is correctly configured, click Save.
Give the data integration task a name (here we choose datasync_rds).
After you have saved, click on Operation to initiate data ingestion from RDS.
Monitor the log near the bottom of the screen to check the status of the data synchronization task.
If the data synchronization ended with return code: , then data ingestion was successful.
Click on Data Development in the DataWorks top menu and select the table into which the data was ingested. Then click on Create Script to create a script (query) that can be run against your MaxCompute table. The script name can be anything you like.
After creating a script, copy-paste the code SELECT * FROM credit_data_rds; (your table name may be different) and click on Run to generate results (see as follows).
The result is displayed in the log tab.
Validate the data by comparing it against the original data in the RDS database.
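One way to make this validation systematic is to compare simple aggregates on both sides, such as the row count and the sum of the income column (a column both tables above share). MaxCompute is usually queried from Python through PyODPS rather than a DB-API driver, so treat this as the shape of the check rather than a drop-in script.

```python
def tables_match(src_conn, dst_conn, src_table: str, dst_table: str) -> bool:
    """Cheap consistency check between two tables over DB-API connections:
    compare row count and the sum of the `income` column.
    Only pass trusted, hard-coded table names (identifiers cannot be bound
    as query parameters).
    """
    def summary(conn, table):
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*), SUM(income) FROM {table}")
        return cur.fetchone()
    return summary(src_conn, src_table) == summary(dst_conn, dst_table)
```

Matching aggregates do not prove the tables are identical row for row, but a mismatch reliably flags a broken or partial sync.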
Ingesting data from ApsaraDB for RDS with the DataWorks IDE is user-friendly and easy: the whole process can be completed end to end in the browser, which enables customers, especially business users, to import data quickly and spend their time on more important tasks such as data analysis.