This topic describes how to use Data Lake Analytics (DLA) to analyze OSS logs after the logs are exported to a specified OSS directory for long-term storage. You can also use Log Service (SLS) to store and analyze OSS logs.

Prerequisites:

  1. OSS is activated. For more information, see Activate OSS.

  2. A bucket is created. For more information, see Create a bucket.

  3. The logging function is enabled for the bucket. For more information, see Logging.

    Notes

    • After you enable and configure logging for a bucket, OSS generates an object on an hourly basis based on the predefined naming conventions. This way, OSS access logs are written to a specified bucket as objects.

    • If you enable the logging function for the first time, this function takes effect about one hour later.
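Because OSS writes one log object per hour, a common first step is to pick out the log objects for a given hour from an object listing. The sketch below shows one way to do this; the name layout in the comment is an assumption for illustration (check the Logging topic for the authoritative naming conventions), and the bucket and prefix names are made up.

```python
import re
from datetime import datetime

# Assumed log-object name layout (verify against the Logging topic):
#   <TargetPrefix><SourceBucket>YYYY-mm-DD-HH-MM-SS-UniqueString
LOG_NAME_RE = re.compile(
    r"^(?P<prefix>.*?)(?P<ts>\d{4}-\d{2}-\d{2}-\d{2}-\d{2}-\d{2})-(?P<uid>\w+)$"
)

def log_object_hour(object_name):
    """Return the timestamp parsed from a log object's name, or None
    if the name does not follow the assumed layout."""
    m = LOG_NAME_RE.match(object_name)
    if not m:
        return None
    return datetime.strptime(m.group("ts"), "%Y-%m-%d-%H-%M-%S")

# Hypothetical object name with target prefix "access-log/":
print(log_object_hour("access-log/mybucket2024-05-01-13-00-00-0001"))
```

With a timestamp per object, you can group an object listing by hour or skip objects outside the time range you want to analyze.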

Procedure

  1. Log on to the OSS console.

  2. In the left-side navigation pane, click Recommended Services. On the Recommended Services page, click the Data Lake Ecosystem tab, move your pointer over Log Analysis, and then click Console.

  3. In the Select the bucket that you want to configure log analysis dialog box, select a region from the Region drop-down list and a bucket from the Bucket drop-down list, and click OK.

  4. If the logging function is not enabled for the selected bucket, enable this function as prompted. Then, return to Step 2 in the OSS console.

  5. If the logging function is enabled for the selected bucket, click OK to go to the Execute page of DLA. On the Execute page, you can analyze data in OSS logs. For descriptions of the log fields, see Log file format.
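On the Execute page you query the log fields with SQL. To illustrate the kind of analysis involved, the sketch below parses sample access-log lines locally and counts HTTP status codes. The line layout and sample records are assumptions for illustration only; the authoritative field list is in Log file format.

```python
import re
from collections import Counter

# Made-up sample records in an assumed, simplified access-log layout:
#   client_ip - - [time] "request" status sent_bytes
SAMPLE_LOGS = [
    '203.0.113.10 - - [01/May/2024:13:02:11 +0800] "GET /mybucket/a.txt HTTP/1.1" 200 512',
    '203.0.113.10 - - [01/May/2024:13:02:12 +0800] "GET /mybucket/b.txt HTTP/1.1" 404 0',
]

LINE_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]+)" (?P<status>\d{3}) (?P<bytes>\d+)'
)

def status_counts(lines):
    """Count HTTP status codes across log lines, skipping lines
    that do not match the assumed layout."""
    counts = Counter()
    for line in lines:
        m = LINE_RE.match(line)
        if m:
            counts[m.group("status")] += 1
    return counts

print(status_counts(SAMPLE_LOGS))
```

In DLA itself, the equivalent analysis would be a SQL GROUP BY over the status field of the log table; the local sketch only shows what that aggregation computes.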

FAQ

What do I do if the following error message appears in the DLA console during the execution of an SQL statement after the logging function is enabled for the first time?

This error message indicates that the object that stores OSS logs has not been created in OSS. You can manually create the object in OSS, or wait about one hour for OSS to create the object automatically.