You can upload log files to Object Storage Service (OSS) for storage and then import the log data from OSS to Log Service, where you can query, analyze, and transform the data. Only OSS objects that are 5 GB or smaller can be imported to Log Service. For a compressed object, the 5 GB limit applies to the size of the compressed object.
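Because of the 5 GB limit, it can help to check object sizes before you create an import configuration. The following is a minimal sketch that assumes the oss2 Python SDK; the endpoint, bucket name, and the `logs/` prefix are placeholders, not values required by Log Service.

```python
# A minimal sketch, assuming the oss2 SDK and placeholder credentials/bucket names:
# list objects under a prefix and flag any object larger than the 5 GB import limit.
import oss2

LIMIT = 5 * 1024 ** 3  # 5 GB, the maximum size of a single importable object

auth = oss2.Auth("<your-access-key-id>", "<your-access-key-secret>")
bucket = oss2.Bucket(auth, "https://oss-cn-hangzhou.aliyuncs.com", "<your-bucket-name>")

for obj in oss2.ObjectIterator(bucket, prefix="logs/"):  # "logs/" is a hypothetical prefix
    if obj.size > LIMIT:
        print(f"{obj.key}: {obj.size} bytes exceeds the 5 GB limit and cannot be imported")
```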
Prerequisites
- Log files are uploaded to an OSS bucket. For more information, see Upload objects.
- A project and a Logstore are created. For more information, see Create a project and Create a Logstore.
- Log Service is authorized to assume the AliyunLogImportOSSRole role to access your OSS resources. You can complete the authorization on the Cloud Resource Access Authorization page.
If you use a RAM user, you must grant the PassRole permission to the RAM user. The following example shows a policy that you can use to grant the PassRole permission. For more information, see Create a custom policy and Grant permissions to a RAM user.
{ "Statement": [ { "Effect": "Allow", "Action": "ram:PassRole", "Resource": "acs:ram:*:*:role/aliyunlogimportossrole" } ], "Version": "1" }
Create a data import configuration
View the data import configuration
After you create the data import configuration, you can view the configuration details and related statistical reports in the Log Service console.
What to do next
On the Import Configuration Overview page, you can perform the following operations on the data import configuration:
- Modify the configuration
Click Modify Settings to modify the data import configuration. For more information, see Create a data import configuration.
- Delete the configuration
Click Delete Configuration to delete the data import configuration. Warning: After a data import configuration is deleted, it cannot be recovered.
FAQ
Issue | Cause | Solution |
---|---|---|
I cannot select an OSS bucket when I create a data import configuration. | The AliyunLogImportOSSRole role is not assigned to Log Service. | Complete the authorization based on the descriptions in the "Prerequisites" section of this topic. |
Data cannot be imported. | The sizes of some OSS objects exceed 5 GB. | Reduce the sizes of the OSS objects to 5 GB or less. |
After data is imported, I cannot query or analyze the data. | No indexes are configured, or the configured indexes do not take effect. | Before you import data, we recommend that you configure indexes for the Logstore to which you want to import the data (see the sketch after this table). For more information, see Configure indexes. If the issue occurs after data is imported, reconfigure the indexes for your Logstore. For more information, see Reindex logs for a Logstore. |
Archive objects cannot be imported. | Restore Archived Files is turned off. | Turn on Restore Archived Files in the data import configuration. |
The Regular Expression Filter parameter is specified, but no data is collected. | The specified regular expression is invalid or does not match the OSS objects. | Reconfigure the Regular Expression Filter parameter. If the issue persists, the cause may be that a large number of OSS objects are stored in the OSS bucket. In this case, specify a more specific directory for the OSS objects to reduce the number of objects that must be traversed. |
Logs are imported, but no data is found in the Log Service console. | The log time is beyond the data retention period of the Logstore, and the expired data is deleted. | Check the time range of the query and the data retention period of the Logstore. |
The extracted log time is used to query data, but no data is found for that time. | The specified time format is invalid. | Check whether the time format is supported by Java SimpleDateFormat. For more information, see Class SimpleDateFormat. |
An error occurred when an OSS object in the Multi-line Text format was parsed. | The regular expression that is specified to match the first line or the last line of a log is invalid. | Specify a valid regular expression. |
The import speed suddenly slows down. | | |
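For the FAQ entry about data that cannot be queried or analyzed, you can configure a full-text index on the target Logstore before you import data. The following Python sketch assumes the aliyun-log-python-sdk package; the endpoint, project, and Logstore names are placeholders, and the class and parameter names should be verified against the SDK version that you install.

```python
# A minimal sketch (assumed class names from aliyun-log-python-sdk):
# enable a full-text index on the Logstore before importing OSS data.
from aliyun.log import LogClient, IndexConfig, IndexLineConfig

client = LogClient("cn-hangzhou.log.aliyuncs.com",  # placeholder endpoint
                   "<your-access-key-id>", "<your-access-key-secret>")

# Full-text (line) index that splits logs on common delimiters.
line_config = IndexLineConfig(
    token_list=[",", " ", "'", '"', ";", "=", "(", ")", "[", "]",
                "{", "}", "?", "@", "&", "<", ">", "/", ":", "\n", "\t"],
    case_sensitive=False,
)

# Field-level indexes can be added later with IndexKeyConfig if needed.
client.create_index("<your-project>", "<your-logstore>",
                    IndexConfig(line_config=line_config))
```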