Logstash is an open source, server-side data processing pipeline. It can collect data from multiple sources at the same time, transform the data, and then write it to a specified storage destination. AnalyticDB for MySQL is fully compatible with MySQL, so you can write data from any source supported by Logstash input plug-ins into AnalyticDB for MySQL. This topic describes how to use the logstash-input-kafka plug-in to write Apache Kafka data to AnalyticDB for MySQL.
- Input plug-ins that are used to collect data of various types and in different sizes from disparate sources
In common business scenarios, data is stored in a variety of formats across multiple systems, sometimes centralized and sometimes scattered. Logstash supports multiple data input modes and can collect data from disparate sources at the same time, ingesting logs, metrics, web applications, data stores, and AWS services in a continuous stream.
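As a sketch, a Kafka input stage for the pipeline described in this topic might look like the following; the broker address, topic name, and consumer group ID are placeholders that you must adjust for your environment:

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"  # Kafka broker address (placeholder)
    topics => ["logstash_test"]            # topic(s) to consume (placeholder)
    group_id => "logstash_group"           # consumer group ID (placeholder)
    codec => "json"                        # decode each message as JSON
  }
}
```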
- Filter plug-ins that are used to parse and convert data in real time
Logstash uses filters to parse all types of events, identify defined fields to build structure, and convert the events into a common format before transmitting them to destination repositories, which makes the data easier to analyze and derive value from.
- Use Grok to parse unstructured data into structured data.
- Parse geographic information from IP addresses.
- Anonymize personally identifiable information (PII), or completely exclude sensitive fields.
- Simplify overall processing, regardless of the data source, format, or schema.
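For example, a filter stage that combines several of these capabilities might look like the following sketch. The Grok pattern shown targets Apache access logs, the `clientip` field assumes that pattern produced it, and `password` stands in for whatever sensitive field your events carry:

```
filter {
  grok {
    # Parse an unstructured Apache access-log line into named fields
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  geoip {
    # Derive geographic fields from the client IP extracted above
    source => "clientip"
  }
  mutate {
    # Completely exclude a sensitive field from the event
    remove_field => ["password"]
  }
}
```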
- Output plug-ins that are used to export data
In addition to AnalyticDB for MySQL, Logstash supports a variety of other output destinations.
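Because AnalyticDB for MySQL is MySQL-compatible, one possible way to write events to it is through the community logstash-output-jdbc plug-in. The sketch below assumes that plug-in and a MySQL JDBC driver are installed; the connection string, table, and column names are placeholders:

```
output {
  jdbc {
    # Community logstash-output-jdbc plug-in; all values below are placeholders
    driver_class      => "com.mysql.jdbc.Driver"
    connection_string => "jdbc:mysql://your-adb-endpoint:3306/your_db?user=USER&password=PASS"
    statement         => ["INSERT INTO logs (ts, message) VALUES (?, ?)", "@timestamp", "message"]
  }
}
```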
Apache Kafka is a distributed publish-subscribe messaging service for logs that provides high throughput, high availability, high performance, a distributed architecture, scalability, and durability. Apache Kafka is widely used by major companies and integrates with Logstash out of the box, so you do not need to build the integration yourself.
For more information, see Use Logstash to write Apache Kafka data to AnalyticDB for MySQL.