This topic describes how to import data from Elasticsearch to Log Service. After you import data to Log Service, you can query, analyze, and transform the data in the Log Service console.
Prerequisites
- An Elasticsearch cluster is available.
- A project and a Logstore are created. For more information, see Create a project and Create a Logstore.
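To confirm the first prerequisite, you can check that the Elasticsearch cluster is reachable before you configure the import. The following is a minimal sketch in Python that uses the requests package; the cluster address is a placeholder, not a value from this topic:

```python
import requests

# Placeholder address; replace with the address of your Elasticsearch cluster.
# Add auth=("user", "password") to the request if your cluster requires authentication.
ES_ADDRESS = "http://localhost:9200"

# _cluster/health is a standard Elasticsearch API that reports cluster status.
resp = requests.get(f"{ES_ADDRESS}/_cluster/health", timeout=10)
resp.raise_for_status()

health = resp.json()
print(f"cluster: {health['cluster_name']}, status: {health['status']}")
```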
Create a data import configuration
- Log on to the Log Service console.
- In the Import Data section, click the Data Import tab. Then, click Elasticsearch - Data Import.
- Select the project and Logstore. Then, click Next.
- Configure the parameters for the data import configuration.
- Preview data, configure indexes, and then click Next. By default, full-text indexing is enabled for Log Service. You can also configure field indexes for the collected logs in manual or automatic mode. To configure field indexes in automatic mode, click Automatic Index Generation, and Log Service automatically creates field indexes. For more information, see Create indexes.
Important: If you want to query and analyze logs, you must enable full-text indexing or field indexing. If you enable both full-text indexing and field indexing, the system uses only field indexes.
- Click Log Query. You are redirected to the search and analysis page of the Logstore. Wait approximately 1 minute. If you can query the Elasticsearch data on this page, the import task is successful.
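You can also run this verification query programmatically. The following is a minimal sketch, assuming the aliyun-log-python-sdk package is installed; the endpoint, AccessKey pair, project name, and Logstore name are placeholders that you must replace with your own values:

```python
import time
from aliyun.log import LogClient, GetLogsRequest

# Placeholder values; replace with your endpoint, AccessKey pair,
# project name, and Logstore name.
client = LogClient("cn-hangzhou.log.aliyuncs.com",
                   "your-access-key-id", "your-access-key-secret")

# Query all data from the last 15 minutes.
now = int(time.time())
request = GetLogsRequest("your-project", "your-logstore",
                         fromTime=now - 900, toTime=now, query="*")
response = client.get_logs(request)

# A non-zero count means the Elasticsearch data is queryable in the Logstore.
print(f"matched log entries: {response.get_count()}")
```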
View a data import configuration
After you create a data import configuration, you can view the configuration and related reports in the Log Service console.
- In the Projects section, click the project to which the data import configuration belongs.
- Find the Logstore to which the data import configuration belongs, open the list of data import configurations under the Logstore, and then click the name of the data import configuration.
- On the Import Configuration Overview page, view the basic information and reports of the data import configuration.
What to do next
- Delete a data import configuration: On the Import Configuration Overview page, you can click Delete Configuration to delete the data import configuration.
Warning: After a data import configuration is deleted, it cannot be restored. Proceed with caution.
- Stop and restart the import task of a data import configuration: After you create a data import configuration, Log Service creates an import task. On the Import Configuration Overview page, you can click Stop to stop the import task. After the import task is stopped, you can also restart it.
Important: After an import task is stopped, the task remains in the stopped state for up to 24 hours. If the import task is not restarted within this period, the task becomes unavailable. If you restart an unavailable import task, errors may occur.
FAQ
Issue | Possible cause | Solution |
---|---|---|
An Elasticsearch connection error occurs during the preview. Error code: failed to connect. | The Elasticsearch cluster cannot be reached from Log Service. For example, the cluster address or port in the data import configuration is invalid, or the network between Log Service and the cluster is not connected. | Check the cluster address, port, and network settings in the data import configuration, and make sure that the Elasticsearch cluster is reachable from Log Service. |
A timeout error occurs during the preview. Error code: preview request timed out. | The Elasticsearch index that you want to import contains no data or contains no data that meets the specified filter conditions. | Make sure that the Elasticsearch index contains data and that the specified filter conditions match the data that you want to import. |
The log time displayed in Log Service is different from the actual time of imported data. | The time field is not specified in the data import configuration, or the specified time format or time zone is incorrect. | Specify a time field, or specify a correct time format or time zone. For more information, see Create a data import configuration. You can also test a time format locally, as shown in the sketch after this table. |
After data is imported, the data cannot be queried or analyzed. | No indexes are configured for the Logstore, or the query time range does not cover the log time of the imported data. | Configure full-text indexes or field indexes for the Logstore, and make sure that the query time range covers the log time of the imported data. For more information, see Create indexes. |
The number of imported data entries is less than expected. | Data entries whose size is larger than 3 MB exist in Elasticsearch. You can view the data entries on the Data Processing Insight dashboard. | Reduce the size of the data entries that are larger than 3 MB. |
After incremental import is enabled, a large latency exists when new data is imported. | The import task pulls data from Elasticsearch in rounds, and a new round that can pick up new data starts only after the previous round is complete. If a round involves a large amount of data, new data must wait for that round to finish before it is imported. | Wait for the current round of import to complete. If the latency persists, reduce the amount of data that each round scans, for example, by specifying more precise filter conditions. |
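For the time-format row above, you can test a candidate time format against a sample value locally before you update the data import configuration. The following is a minimal sketch in Python; the sample value and format string are illustrative, and Python's strptime directives must be mapped to the format syntax that the console expects:

```python
from datetime import datetime

# Illustrative sample value and format string; replace with a value taken
# from your Elasticsearch time field and the format you plan to configure.
raw_value = "2023-07-14T08:30:00+0800"
time_format = "%Y-%m-%dT%H:%M:%S%z"

# strptime raises ValueError if the format does not match the data,
# which usually points to a wrong format string or time zone handling.
parsed = datetime.strptime(raw_value, time_format)
print(parsed.isoformat())  # 2023-07-14T08:30:00+08:00
```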
Error handling
Item | Description |
---|---|
Communication with the Elasticsearch cluster is abnormal. | The import task pulls Elasticsearch data in scroll mode, with a default keep-alive duration of 24 hours. If a network connection error or another error that prevents normal communication with Elasticsearch occurs, such as a user authentication error, the import task is automatically retried. If communication cannot be restored within 24 hours, the scroll session information on Elasticsearch is deleted and the import task cannot be resumed even when it is retried. In this case, the system reports the "No search context found" error, and you can only re-create the import task. For how scroll-based pulling works in general, see the sketch after this table. |
The Logstore does not exist. | The import task is retried at regular intervals. If you re-create the Logstore within 24 hours, the import task continues to read Elasticsearch data from the position at which it stopped. Otherwise, the scroll session information on Elasticsearch is deleted, and you can only re-create the import task. |
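The scroll keep-alive behavior described above is standard Elasticsearch behavior, not specific to Log Service. The following is a minimal sketch of a scroll-based pull with a 24-hour keep-alive, using the official elasticsearch Python client; the cluster address and index name are placeholders, and this illustrates the mechanism rather than Log Service's internal implementation:

```python
from elasticsearch import Elasticsearch

# Placeholder address; replace with your own cluster.
es = Elasticsearch("http://localhost:9200")

# Open a scroll context; "24h" is the keep-alive duration. If the context
# is not renewed within this window, Elasticsearch deletes it, and later
# scroll requests fail with "No search context found".
resp = es.search(index="my-index", scroll="24h", size=1000,
                 query={"match_all": {}})

while resp["hits"]["hits"]:
    for hit in resp["hits"]["hits"]:
        print(hit["_id"])  # replace with your own handling of hit["_source"]
    # Each scroll call renews the keep-alive and returns the next batch.
    resp = es.scroll(scroll_id=resp["_scroll_id"], scroll="24h")

# Release the scroll context when the pull is complete.
es.clear_scroll(scroll_id=resp["_scroll_id"])
```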