Data Lake Analytics (DLA) is a serverless, cloud-native, interactive query and analytics service. It allows you to use the Presto and Spark engines to analyze data from a variety of data sources. This Quick Start walks you through the basic procedure of using DLA: activating the service, building a data lake, and using the Presto and Spark engines to analyze and compute data.
- Activate DLA.
- Build a data lake. You can use one of the following methods to build a data lake:
- Manually upload files to Object Storage Service (OSS). Then, use the metadata crawling feature to create tables to build a data lake. For more information, see Upload objects and OSS data sources.
- Use another service to deliver files to OSS. For example, use the ActionTrail console to deliver log files to OSS. Then, use the metadata crawling feature to create tables to build a data lake. For more information, see Create a single-account trail and OSS data sources.
- Use one-click data warehousing or merge multiple databases into a data warehouse. You can also build a real-time data lake based on databases and message logs. For more information, see One-click data warehousing, Create a data warehouse by merging databases, and Build a real-time data lake.
- Access data sources. You can use DLA to access OSS or other data sources to analyze and compute data. For more information, see Use the serverless Presto engine to access data sources and Use the serverless Spark engine to access data sources.
- Analyze and compute data. You can use the serverless Presto or Spark engine to analyze and compute data. For more information, see Serverless Presto and Serverless Spark.
- Implement data applications. You can use DataWorks or Data Management (DMS) to schedule DLA Presto and Spark tasks, and display the query and analysis results of OSS data as business intelligence (BI) reports. For more information, see Create Quick BI visualized reports.
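To make the table-creation step above concrete, the following is a minimal sketch of a Hive-style `CREATE EXTERNAL TABLE` statement over an OSS path, the kind of table definition that the metadata crawling feature can also generate for you. The schema, table name, columns, and the `oss://` path are all hypothetical placeholders, not values from this document.

```python
# A minimal sketch of a DLA external-table DDL over OSS data.
# The table name, columns, and the oss:// location below are
# hypothetical placeholders; adapt them to your own bucket layout.
ddl = """
CREATE EXTERNAL TABLE IF NOT EXISTS sample_logs (
    request_time STRING,
    status INT,
    url STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION 'oss://my-bucket/logs/'
"""
```

Once such a table exists, the data stays in OSS and DLA reads it in place; dropping the table removes only the metadata, not the underlying objects.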
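The analysis step can be sketched as follows. DLA's serverless Presto engine exposes a MySQL-protocol-compatible endpoint, so a standard DB-API-style MySQL client (for example, PyMySQL) can submit queries to it. The function below is a sketch under that assumption: the table name, column names, and the aggregation itself are hypothetical placeholders, and `conn` is any DB-API connection you have opened to your DLA endpoint.

```python
def top_error_urls(conn, limit=10):
    """Run a hypothetical aggregation over an OSS-backed table through a
    DB-API connection to DLA's MySQL-protocol-compatible endpoint.

    The sample_logs table and its columns are placeholders for
    illustration only.
    """
    sql = (
        "SELECT url, COUNT(*) AS errors "
        "FROM sample_logs "
        "WHERE status >= 500 "
        "GROUP BY url "
        "ORDER BY errors DESC "
        f"LIMIT {int(limit)}"
    )
    # Cursors from DB-API clients such as PyMySQL support the
    # context-manager protocol, so the cursor is closed automatically.
    with conn.cursor() as cur:
        cur.execute(sql)
        return cur.fetchall()
```

Because the endpoint speaks the MySQL protocol, the same pattern works from BI tools and schedulers that already know how to talk to MySQL, which is what the DataWorks and DMS integrations build on.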