Applications generate various types of logs while they run. You can collect the log data you need and transfer it to Alibaba Cloud Elasticsearch for queries and analysis. This topic provides an overview of best practices for log synchronization and analysis in common business scenarios.
|Scenario|Description|
|---|---|
|Use Filebeat to collect Apache log data|The typical log collection mode of Elastic Stack is used. Use Alibaba Cloud Filebeat to collect Apache log data. Then, use Alibaba Cloud Logstash to filter the collected data and transfer the processed data to an Alibaba Cloud Elasticsearch cluster for queries and analysis.|
|Use the logstash-input-sls plug-in to obtain logs from Log Service|Use the logstash-input-sls plug-in to obtain logs from Log Service and transfer them to Alibaba Cloud Elasticsearch for queries and analysis.|
|Use self-managed Filebeat to collect MySQL logs|Use self-managed Filebeat to collect and send MySQL logs to Alibaba Cloud Elasticsearch. Then, query, analyze, and visualize these logs in the Kibana console.|
|Use Alibaba Cloud Elasticsearch and Rsbeat to analyze Redis slow logs in real time|Use Rsbeat to collect and send Redis slow logs to Alibaba Cloud Elasticsearch. Then, perform graphical analysis on the logs in the Kibana console.|
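As a sketch of the first scenario above (Filebeat to Logstash to Elasticsearch for Apache logs), a minimal Logstash pipeline might look like the following. The endpoint, credentials, and index name are placeholders, not values from this topic; replace them with the settings of your own Alibaba Cloud Logstash and Elasticsearch clusters.

```
# Hypothetical Logstash pipeline: receive Apache access logs from Filebeat,
# parse each line with grok, and write the result to Elasticsearch.
input {
  beats {
    port => 5044    # the port that Filebeat's output.logstash setting points at
  }
}

filter {
  grok {
    # COMBINEDAPACHELOG is a built-in grok pattern for Apache combined-format logs
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts    => ["http://es-cn-example.elasticsearch.aliyuncs.com:9200"]  # placeholder endpoint
    user     => "elastic"
    password => "your_password"
    index    => "apache-logs-%{+YYYY.MM.dd}"   # one index per day
  }
}
```

After the pipeline runs, you can create an index pattern such as `apache-logs-*` in the Kibana console to query and visualize the parsed fields.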