DataWorks is a platform that can be used to process and analyze large amounts of data in offline mode. It uses MaxCompute as a computing and storage engine to offer fully hosted services for visual workflow development, scheduling, and O&M. In DataWorks, tasks can be hosted and scheduled by time or dependency. This topic describes how to use DataWorks to schedule DLA tasks.
Task scheduling relies on dependencies between tasks. In this topic, two DLA tasks are created in DataWorks to demonstrate the dependencies between tables and tasks, as shown in the following figure.
- Task 1
DataWorks executes the o_orderstatus = 'F' filter condition to select the finished orders from the orders table and then writes the finished orders into the finished_orders table.
- Task 2
DataWorks executes the o_totalprice > 10000 filter condition to select the orders whose total price is greater than USD 10,000 from the finished_orders table and then writes these orders into the high_value_finished_orders table.
The data of the orders table is stored in the oss://dlaossfile1/dla/ path, where dlaossfile1 indicates the bucket and dla indicates the directory. You can click the orders.txt file in the preceding path to download the source data. The empty files finished_orders.txt and high_value_finished_orders.txt, which store the task results, are stored in the oss://dlaossfile1/dla/finished_orders/ and oss://dlaossfile1/dla/high_value_finished_orders/ paths, respectively.
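For reference, each line in orders.txt is expected to contain nine pipe-delimited fields that match the schema of the orders table defined later in this topic. The following rows are hypothetical examples for illustration only; they are not taken from the actual file:

```
1|7381|F|17345.62|1996-01-02|5-LOW|Clerk#000000951|0|final accounts sleep blithely
2|7860|O|46929.18|1996-12-01|1-URGENT|Clerk#000000880|0|quickly regular requests
```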
- The DLA, DataWorks, and OSS services are activated and deployed in the same region. In this topic, the three services are all deployed in the China (Hangzhou) region.
- A workspace is created. For more information, see Create a workspace. In this topic, the liujing_dataworks_test workspace is created.
- An endpoint is created in DLA. For more information, see Set an endpoint.
- Create an OSS schema in DLA by using the following statement:
CREATE SCHEMA dataworks_demo WITH DBPROPERTIES (
    CATALOG = 'oss',
    LOCATION = 'oss://dlaossfile1/dla/'
);
- Create the orders, finished_orders, and high_value_finished_orders tables for the OSS files in DLA by using the following statements:
- Statement to create the orders table
CREATE EXTERNAL TABLE IF NOT EXISTS orders (
    O_ORDERKEY INT,
    O_CUSTKEY INT,
    O_ORDERSTATUS STRING,
    O_TOTALPRICE DOUBLE,
    O_ORDERDATE DATE,
    O_ORDERPRIORITY STRING,
    O_CLERK STRING,
    O_SHIPPRIORITY INT,
    O_COMMENT STRING
) ROW FORMAT DELIMITED
FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION 'oss://dlaossfile1/dla/';
- Statement to create the finished_orders table
CREATE EXTERNAL TABLE IF NOT EXISTS finished_orders (
    O_ORDERKEY INT,
    O_TOTALPRICE DOUBLE
) ROW FORMAT DELIMITED
FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION 'oss://dlaossfile1/dla/finished_orders/';
- Statement to create the high_value_finished_orders table
CREATE EXTERNAL TABLE IF NOT EXISTS high_value_finished_orders (
    O_ORDERKEY INT,
    O_TOTALPRICE DOUBLE
) ROW FORMAT DELIMITED
FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION 'oss://dlaossfile1/dla/high_value_finished_orders/';
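After the schema and tables are created, you can run a quick sanity check on the DLA endpoint, for example in the DLA console or through a MySQL-compatible client. The following statements are a sketch; the row count depends on the contents of orders.txt:

```sql
-- Confirm that the three tables were created in the schema.
SHOW TABLES IN dataworks_demo;

-- Verify that the orders table can read the source file in OSS.
SELECT COUNT(*) FROM dataworks_demo.orders;
```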
- Add a DLA data source in DataWorks.
- Log on to the DataWorks console. In the left-side navigation pane, click Workspaces. On the page that appears, find your workspace and click Data Integration in the Actions column.
- On the Welcome to Data Integration page, click the Connection icon. In the left-side
navigation pane, click Data Source. On the page that appears, click New data source in the upper-right corner. In the Add data source dialog box, click the Data Lake Analytics(DLA) icon in Big Data Storage.
- In the Add Data Lake Analytics(DLA) data source dialog box, specify the parameters shown in the following figure.
The following parameters must be specified:
- Data Source Name: The name of the data source. We recommend that you specify an informative name for easy management.
- Description: The description of the data source. This parameter is optional.
- Connection Url: The DLA endpoint in the Address:Port format. For more information about how to obtain Address:Port, see t1916498.html#topic-2566550.
- Database: The name of the Object Storage Service (OSS) database that you create in DLA. In this topic, set this parameter to dataworks_demo.
- User name: The username that is used to access DLA.
- Password: The password that is used to access DLA.
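As an illustration, a filled-in configuration might look like the following. The data source name and endpoint below are hypothetical placeholders; use the actual endpoint obtained from your DLA console:

```
Data Source Name:  dla_dataworks_demo
Description:       DLA data source for the dla_test_1 business process
Connection Url:    service.cn-hangzhou.datalakeanalytics.aliyuncs.com:10000
Database:          dataworks_demo
User name:         <your DLA username>
Password:          <your DLA password>
```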
- Modify the IP address whitelist for DLA in DataWorks.
DataWorks allows you to add only the DLA data sources whose IP addresses are specified in an IP address whitelist of DataWorks. Therefore, you must add the IP addresses or Classless Inter-Domain Routing (CIDR) blocks of your DLA region to the IP address whitelist.
- China (Hangzhou): 100.64.0.0/8,188.8.131.52/24,184.108.40.206/24,220.127.116.11/24,18.104.22.168/24,22.214.171.124/24,126.96.36.199/24,188.8.131.52/24,184.108.40.206/24,220.127.116.11/24,18.104.22.168/24,22.214.171.124/24,126.96.36.199/24
- China (Shanghai): 188.8.131.52/24,184.108.40.206/24,220.127.116.11/24,18.104.22.168/24,22.214.171.124/24,10.117.28.203,10.117.39.238,10.143.32.0/24,10.152.69.0/24,10.153.136.0/24,10.27.63.15,10.27.63.38,10.27.63.41,10.27.63.60,10.46.64.81,10.46.67.156,126.96.36.199/24,188.8.131.52/24,184.108.40.206/24,220.127.116.11/24,18.104.22.168/24,22.214.171.124/24,126.96.36.199/24,188.8.131.52/24,184.108.40.206/24,220.127.116.11/24,18.104.22.168,22.214.171.124,126.96.36.199,188.8.131.52,184.108.40.206,220.127.116.11,18.104.22.168,22.214.171.124,100.64.0.0/8
- China (Shenzhen): 100.106.46.0/24,100.106.49.0/24,10.152.27.0/24,10.152.28.0/24,126.96.36.199/24,188.8.131.52/24,184.108.40.206/24,100.64.0.0/8,220.127.116.11/24,18.104.22.168/24,22.214.171.124/24
- China (Hong Kong): 10.152.162.0/24,126.96.36.199/24,188.8.131.52/24,100.64.0.0/8,184.108.40.206/24,220.127.116.11/24,18.104.22.168/24,22.214.171.124/24,126.96.36.199/24
- Singapore (Singapore): 100.106.10.0/24,100.106.35.0/24,10.151.234.0/24,10.151.238.0/24,10.152.248.0/24,188.8.131.52/24,184.108.40.206/24,220.127.116.11/24,100.64.0.0/8,100.106.10.0/24,100.106.35.0/24,10.151.234.0/24,10.151.238.0/24,10.152.248.0/24,18.104.22.168/24,22.214.171.124/24,126.96.36.199/24,188.8.131.52/24,184.108.40.206/24,220.127.116.11/24,18.104.22.168/24,22.214.171.124/24,126.96.36.199/24,188.8.131.52/24,184.108.40.206/24
- Australia (Sydney): 220.127.116.11/24,18.104.22.168/24,22.214.171.124/24,126.96.36.199/24,188.8.131.52/24,100.64.0.0/8,184.108.40.206/24,220.127.116.11/24,18.104.22.168/24,22.214.171.124/24
- China (Beijing): 100.106.48.0/24,10.152.167.0/24,10.152.168.0/24,126.96.36.199/24,188.8.131.52/24,184.108.40.206/24,220.127.116.11/24,100.64.0.0/8,18.104.22.168/24,22.214.171.124/24,126.96.36.199/24,188.8.131.52/24,184.108.40.206/24,220.127.116.11/24,18.104.22.168/24
- US (Silicon Valley): 10.152.160.0/24,100.64.0.0/8,22.214.171.124/24,126.96.36.199/24,188.8.131.52/24
- US (Virginia): 184.108.40.206/24,220.127.116.11/24,18.104.22.168/24,100.64.0.0/8,22.214.171.124/24,126.96.36.199/24
- Malaysia (Kuala Lumpur): 188.8.131.52/24,184.108.40.206/24,220.127.116.11/24,18.104.22.168/24,100.64.0.0/8,22.214.171.124/24,126.96.36.199/24,188.8.131.52/24
- Germany (Frankfurt): 184.108.40.206/24,220.127.116.11/24,18.104.22.168/24,22.214.171.124/24,126.96.36.199/24,100.64.0.0/8,188.8.131.52,184.108.40.206,220.127.116.11,18.104.22.168,22.214.171.124,126.96.36.199,188.8.131.52,184.108.40.206,220.127.116.11,18.104.22.168,22.214.171.124,126.96.36.199/24,188.8.131.52/24
- Japan (Tokyo): 100.105.55.0/24,184.108.40.206/24,220.127.116.11/24,18.104.22.168/24,100.64.0.0/8,22.214.171.124/24,126.96.36.199/24,188.8.131.52/24,184.108.40.206/24,220.127.116.11/24
- UAE (Dubai): 18.104.22.168/24,22.214.171.124/24,126.96.36.199/24,188.8.131.52/24,184.108.40.206/24,100.64.0.0/8
- India (Mumbai): 220.127.116.11/24,18.104.22.168/24,22.214.171.124/24,126.96.36.199/24,188.8.131.52/24,100.64.0.0/8,184.108.40.206/24,220.127.116.11/24
- UK (London): 18.104.22.168/24,100.64.0.0/8
- Indonesia (Jakarta): 22.214.171.124/24,126.96.36.199/24,188.8.131.52/24,184.108.40.206/24,100.64.0.0/8,220.127.116.11/24,10.143.32.0/24,18.104.22.168/24
- China North 2 Ali Gov: If the CIDR block 22.214.171.124/24,100.64.0.0/8 cannot be added, add the following IP addresses: 126.96.36.199, 188.8.131.52, 184.108.40.206, 220.127.116.11, 18.104.22.168, 22.214.171.124, 126.96.36.199, 188.8.131.52, 184.108.40.206, 220.127.116.11, 18.104.22.168, 22.214.171.124, 126.96.36.199, and 188.8.131.52.
- After you specify the preceding parameters, click Test connectivity. After the connectivity test is passed, click Complete.
- Create a DLA scheduling task in DataWorks.
- Log on to the DataWorks console. In the left-side navigation pane, click Workspaces. On the page that appears, find your workspace and click Data Analytics in the Actions column.
- In the left-side pane, right-click Business process and select New business process. In this topic, the dla_test_1 business process is created.
- Create a DLA task. In this topic, the finished_orders task is created.
In the left-side pane, expand the business process and click the finished_orders task. On the finished_orders tab, expand Customize and click Data Lake Analytics.
Note: You can repeat this step to create multiple tasks. In this topic, the finished_orders and high_value_finished_orders tasks are created.
- Run a DLA scheduling task in DataWorks.
- Task 1: DataWorks uses the o_orderstatus = 'F' filter condition to select the finished orders from the orders table and then writes the finished orders into the finished_orders table.
insert into finished_orders select O_ORDERKEY, O_TOTALPRICE from orders where O_ORDERSTATUS = 'F';
- Task 2: DataWorks uses the o_totalprice > 10000 filter condition to select the orders whose total price is greater than USD 10,000 from the finished_orders table and then writes these orders into the high_value_finished_orders table.
insert into high_value_finished_orders select * from finished_orders where O_TOTALPRICE > 10000;
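After both tasks finish, you can spot-check the results with queries like the following. This is a sketch; the exact row counts depend on the contents of the source data:

```sql
-- Every row written by Task 2 must satisfy the o_totalprice > 10000 filter,
-- so this count is expected to be 0 if both tasks ran correctly.
SELECT COUNT(*)
FROM high_value_finished_orders
WHERE O_TOTALPRICE <= 10000;

-- Compare the number of rows produced at each stage of the pipeline.
SELECT
    (SELECT COUNT(*) FROM finished_orders) AS finished_cnt,
    (SELECT COUNT(*) FROM high_value_finished_orders) AS high_value_cnt;
```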
What to do next
- Task configuration
In DataWorks, you can configure tasks to be triggered by time or dependency. You can also configure multiple tasks to be executed at a specified time based on specified dependencies.
To run the finished_orders task at 02:00 every day, specify the parameter shown in the following figure.
To run the high_value_finished_orders task after the finished_orders task runs successfully, specify the time parameter shown in the following figure.
- Task publish
After you configure tasks, you can publish and maintain these tasks. For more information, see Deploy a node.