This topic describes how to use Function Compute to process logs in Alibaba Cloud Log Service. In this solution, Log Service collects logs from the web servers of an e-commerce platform, and a managed extract, transform, and load (ETL) function written in Function Compute processes the logs. A Log Service trigger invokes the managed ETL function at a near-real-time frequency, measured in seconds, and the processed data is displayed on a Log Service dashboard as easy-to-read graphs and charts. The solution supports the following scenarios:
- Data processing: Consume and transform logs that are written to Log Service in real time.
- Data shipping: Extract data from a data source and load the processed data into a data warehouse based on predefined data warehouse models.
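As a sketch of the data processing scenario, the ETL step can be as simple as parsing raw access-log lines into structured fields before they are written back to Log Service. The log format, field names, and regular expression below are illustrative assumptions, not part of the solution:

```python
import re

# Hypothetical access-log format, for illustration only:
#   client_ip - [timestamp] "METHOD /path" status latency_ms
LOG_PATTERN = re.compile(
    r'(?P<client_ip>\S+) - \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+)" (?P<status>\d+) (?P<latency_ms>\d+)'
)

def transform(raw_line):
    """Parse one raw log line into a dict of structured fields.

    Returns None for lines that do not match, so callers can drop them.
    """
    match = LOG_PATTERN.match(raw_line)
    if match is None:
        return None
    fields = match.groupdict()
    # Convert numeric fields so the dashboard can aggregate them.
    fields["status"] = int(fields["status"])
    fields["latency_ms"] = int(fields["latency_ms"])
    return fields

line = '203.0.113.7 - [2024-05-01T12:00:00Z] "GET /cart" 200 35'
print(transform(line)["path"])  # prints "/cart"
```

A real ETL function would apply a transformation like this to each log pulled from the source Logstore and write the result to the destination Logstore.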
The following figure shows the architecture of the solution.
In the preceding figure:
- Logtail is used to parse collected logs on an on-premises machine and upload the parsed logs to the source Logstore.
- The source Logstore receives and stores the logs that Logtail parses and reports from the machine group.
- A Log Service trigger is used to periodically trigger the managed ETL function that is predeployed in Function Compute. The function performs ETL operations on raw logs and writes the processed logs to the destination Logstore.
- The logs that the managed ETL function processes are stored in the destination Logstore in a centralized manner. You can use Log Service to analyze or visualize the data.
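A minimal sketch of the function side of this architecture is shown below, assuming the trigger delivers a JSON event whose `source` object identifies the shard and cursor range to process. The field names and sample values are assumptions for illustration; verify them against the actual event payload of your trigger. The pull, transform, and write steps are left as comments because they require the Log Service SDK and real credentials:

```python
import json

def handler(event, context):
    """Entry point invoked by the Log Service trigger.

    Assumes the event is a JSON payload whose "source" object names the
    source Logstore shard and the cursor range to process.
    """
    evt = json.loads(event)
    source = evt["source"]
    shard_id = source["shardId"]
    begin, end = source["beginCursor"], source["endCursor"]
    # 1. Pull raw logs from source["logstoreName"] between the two cursors.
    # 2. Apply the ETL transformation to each log.
    # 3. Write the transformed logs to the destination Logstore.
    return {"shardId": shard_id, "beginCursor": begin, "endCursor": end}

# Illustrative trigger event (all values are placeholders):
sample_event = json.dumps({
    "source": {
        "endpoint": "http://cn-hangzhou-intranet.log.aliyuncs.com",
        "projectName": "etl-project",
        "logstoreName": "source-logstore",
        "shardId": 0,
        "beginCursor": "begin-cursor-placeholder",
        "endCursor": "end-cursor-placeholder",
    }
})
print(handler(sample_event, None)["shardId"])  # prints 0
```

Because the trigger retries failed invocations, the body of the handler should be written to be idempotent.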
The solution provides the following benefits:
- Data collection, storage, transformation, analysis, and visualization are centralized in a single workflow.
- Fully managed log processing tasks are triggered on a regular basis and are automatically retried on failure.
- Logstores can be scaled out by adding shards to meet the resource requirements of big data workloads.
- Function Compute provides features such as data processing, elastic resource provisioning, pay-as-you-go billing, and code customization.
- Built-in function templates are continually added to simplify function development for common cases.