The log analysis service is a paid service that you must activate before use. For details about its fees, see Log Service Pricing.

OSS users often need to analyze access logs and resource consumption data, such as:

  • Usage of OSS storage, traffic, and requests
  • Logs generated throughout a file's lifecycle (creation, modification, and deletion)
  • Hot files, the number of accesses to them, and the traffic those accesses generate
  • Errors and the logs of the requests that caused them

You can use the log analysis function on the OSS console to analyze large volumes of logs. This document describes how to activate and use the log analysis service on the OSS console.

Procedure

  1. Activate the log service.
    1. Log on to the OSS console.
    2. In the Data Processing area, move the mouse cursor to the Log Analysis icon and click Activate Log Service.
    3. On the activation page, select I Agree, and click Activate.
  2. Authorize the log service so that it can obtain data from OSS.
    1. On the OSS console, click Overview in the upper-left corner to refresh the page. Then move the mouse cursor to the Log Analysis icon and click Authorize Log Collection.
      Note Before authorization, you must click Overview on the OSS console to refresh the page.
    2. On the Cloud Resource Access Authorization page, confirm that the role to be authorized is AliyunLogArchiveRole, and then click Authorize.
  3. Associate the log analysis service with a bucket.
    1. On the OSS console, click Overview in the upper-left corner to refresh the page. Then move the mouse cursor to the Log Analysis icon and click Manage Log Service.
      Note You must click Overview on the OSS console to refresh the page before managing the log service.
    2. On the Log Analysis page, click Create association.
    3. The Create Log Analysis Association page is displayed on the right. In Step 1, select Region, enter the Project name and Description (optional), and click Next.

      Pay attention to the following two points:

      • When selecting the Region, select a region in which you have created buckets.
      • When setting the Project name, follow these rules (see the sketch after this list):
        • A project name can contain only lowercase letters, numbers, and hyphens.
        • A project name must start and end with a lowercase letter or a number.
        • A project name must be 3 to 63 characters in length.
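      If you prefer to create the project programmatically, the following is a minimal sketch that uses the aliyun-log-python-sdk. The endpoint, AccessKey pair, project name, and description are placeholder assumptions; replace them with your own values, and note that method names may vary slightly between SDK versions.

        # Minimal sketch: create a Log Service project via the aliyun-log-python-sdk
        # (pip install aliyun-log-python-sdk). All credential and name values are placeholders.
        from aliyun.log import LogClient

        endpoint = "cn-hangzhou.log.aliyuncs.com"       # assumed: must match the region selected above
        access_key_id = "<your-access-key-id>"          # placeholder
        access_key_secret = "<your-access-key-secret>"  # placeholder

        client = LogClient(endpoint, access_key_id, access_key_secret)

        # The project name must be 3 to 63 characters of lowercase letters, numbers, and hyphens,
        # and must start and end with a lowercase letter or a number.
        client.create_project("oss-log-analysis-demo", "Project for OSS access log analysis")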
    4. In Step 2, enter Log store name, select Data storage period and Partition (Shard) number, and click Next.

      Each item you enter and select is described below; a code sketch follows the list:

      • Log store name: The name of the log store, which must follow these rules:
        • A log store name can contain only lowercase letters, numbers, hyphens, and underscores.
        • A log store name must start and end with a lowercase letter or a number.
        • A log store name must be 3 to 63 characters in length.
      • Data storage period: Number of days that data is stored.
      • Partition (Shard) number: For more information, see Partition.
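      As with the project, the log store can also be created programmatically. The following is a minimal sketch under the same assumptions as the project example above (placeholder names and credentials; the aliyun-log-python-sdk is installed):

        # Minimal sketch: create a logstore with a 30-day storage period and 2 shards.
        # Project and logstore names are placeholders; adjust ttl and shard_count as needed.
        from aliyun.log import LogClient

        client = LogClient("cn-hangzhou.log.aliyuncs.com",
                           "<your-access-key-id>", "<your-access-key-secret>")

        client.create_logstore(project_name="oss-log-analysis-demo",
                               logstore_name="oss-access-log",
                               ttl=30,           # Data storage period, in days
                               shard_count=2)    # Partition (Shard) number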
    5. In Step 3, select Bucket to associate with and click Submit.
  4. Configure index information.
    1. Click Go to log service console to configure index information.
    2. If you do not have special requirements, keep the default configuration and click Next.
      Note If you want to configure index information separately, see Index and query.
    3. Configure the log data delivery and ETL functions. If you do not need to deliver log data, click OK. If you want to deliver log data, click Enable delivery for the required delivery method or ETL function, and then click OK.
  5. Analyze logs.
    1. On the OSS console, move the mouse cursor to the Log Analysis icon and click Manage Log Service.
    2. On the Log Analysis page, click Analyze logs.
    3. The log analysis page is displayed. You can view the log analysis results either in the log store or on the dashboard; the sketch below shows an equivalent programmatic query.
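      In addition to the console, you can query the logs programmatically. The following is a minimal sketch that uses the aliyun-log-python-sdk to run a query against the associated log store. The project name, log store name, and the log field name (object) are assumptions; adjust them to your own configuration, and note that method names may differ slightly across SDK versions.

        # Minimal sketch: query the OSS access logs in the associated logstore.
        # Project/logstore names and the "object" field are assumptions; replace as needed.
        import time
        from aliyun.log import LogClient, GetLogsRequest

        client = LogClient("cn-hangzhou.log.aliyuncs.com",
                           "<your-access-key-id>", "<your-access-key-secret>")

        # Example analysis: the 10 most frequently accessed objects in the past hour.
        query = "* | SELECT object, count(*) AS pv GROUP BY object ORDER BY pv DESC LIMIT 10"
        now = int(time.time())
        request = GetLogsRequest(project="oss-log-analysis-demo",
                                 logstore="oss-access-log",
                                 fromTime=now - 3600,
                                 toTime=now,
                                 query=query)
        response = client.get_logs(request)
        for log in response.get_logs():
            print(log.get_contents())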