Logstash is an open source, server-side data processing pipeline that ingests data from multiple sources, transforms it, and sends it to a storage destination. Because AnalyticDB for MySQL supports native Java Database Connectivity (JDBC) access, you can use a Logstash output plugin to write log data directly into AnalyticDB for MySQL.
The community plugin logstash-output-jdbc writes one record at a time, which results in low throughput and high CPU usage at large data volumes. The logstash-output-analyticdb plugin addresses this by writing data in batches, delivering 5x higher throughput with lower CPU overhead.
Prerequisites
Before you begin, make sure you have:
Logstash installed. See Installing Logstash.
The AnalyticDB for MySQL connection details: host, port, database name, username, and password.
Install the plugin
Go to the Logstash root directory:

```shell
cd logstash
```

Install the logstash-output-analyticdb plugin:

```shell
bin/logstash-plugin install logstash-output-analyticdb
```

Create the JDBC driver directory and download the MySQL connector JAR into it:

```shell
mkdir -p vendor/jar/jdbc
cd vendor/jar/jdbc
wget http://central.maven.org/maven2/mysql/mysql-connector-java/5.1.36/mysql-connector-java-5.1.36.jar
```

If the plugin is already installed, update it to the latest version before writing data to get the most recent bug fixes and performance improvements:

```shell
bin/logstash-plugin update logstash-output-analyticdb
```

The logstash-output-analyticdb plugin is now installed.
Configure the plugin
Create a configuration file named logstash-analyticdb.conf in the config directory (you can use a different name).
The following example reads from stdin and writes to AnalyticDB for MySQL:
```conf
input {
    stdin { }
}
output {
    analyticdb {
        driver_class => "com.mysql.jdbc.Driver"
        connection_string => "jdbc:mysql://<hostname>:<port>/<database>?user=<username>&password=<password>"
        statement => [ "INSERT INTO log (host, timestamp, message) VALUES(?, ?, ?)", "host", "@timestamp", "message" ]
        commit_size => 4194304
    }
}
```

Replace the placeholders with your actual values:
| Placeholder | Description |
|---|---|
| &lt;hostname&gt; | AnalyticDB for MySQL endpoint |
| &lt;port&gt; | Port number |
| &lt;database&gt; | Target database name |
| &lt;username&gt; | Database username |
| &lt;password&gt; | Database password |
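For reference, a filled-in version of the configuration might look like the following. The endpoint, database name, and credentials below are hypothetical sample values, and the target `log` table (with columns `host`, `timestamp`, and `message`) is assumed to already exist in the database:

```conf
input {
    stdin { }
}
output {
    analyticdb {
        driver_class => "com.mysql.jdbc.Driver"
        # Hypothetical endpoint and credentials -- substitute your own.
        connection_string => "jdbc:mysql://am-example.ads.aliyuncs.com:3306/logdb?user=loguser&password=examplepw"
        statement => [ "INSERT INTO log (host, timestamp, message) VALUES(?, ?, ?)", "host", "@timestamp", "message" ]
        commit_size => 4194304
    }
}
```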
Plugin parameters
| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| driver_class | string | Yes | — | JDBC driver class. Use com.mysql.jdbc.Driver for MySQL-compatible endpoints. |
| connection_string | string | Yes | — | JDBC connection URL for AnalyticDB for MySQL. |
| statement | array | Yes | — | INSERT statement as an array. The first element is the SQL template; subsequent elements are the field names that map to the ? placeholders. |
| commit_size | integer | No | — | Maximum data volume in bytes per batch. A write triggers when this limit or flush_size is reached, whichever comes first. |
| flush_size | integer | No | — | Maximum number of records per batch. A write triggers when this limit or commit_size is reached, whichever comes first. |
| max_flush_exceptions | integer | No | 100 | Maximum number of retries when a write exception occurs. |
| skip_exception | boolean | No | false | Controls behavior after all retries are exhausted. If false, the program throws an exception and terminates. If true, the exception is skipped and written to a log. |
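The interplay between commit_size and flush_size can be pictured as a dual-trigger buffer: a batch write fires as soon as either the byte budget or the record count is reached. The plugin itself is a Logstash Ruby gem; the following is only a minimal Python sketch of that triggering behavior, with illustrative names:

```python
# Illustrative sketch of a dual-trigger write buffer: a flush fires when
# either the byte limit (commit_size) or the record limit (flush_size)
# is reached, whichever comes first. Names are hypothetical, not the
# plugin's internals.
class BatchBuffer:
    def __init__(self, commit_size, flush_size, writer):
        self.commit_size = commit_size  # max bytes buffered per batch
        self.flush_size = flush_size    # max records buffered per batch
        self.writer = writer            # callback that performs the batch INSERT
        self.records = []
        self.bytes_buffered = 0

    def add(self, record):
        self.records.append(record)
        self.bytes_buffered += len(record.encode("utf-8"))
        if (self.bytes_buffered >= self.commit_size
                or len(self.records) >= self.flush_size):
            self.flush()

    def flush(self):
        # Hand the accumulated batch to the writer and reset the buffer.
        if self.records:
            self.writer(list(self.records))
            self.records.clear()
            self.bytes_buffered = 0

batches = []
buf = BatchBuffer(commit_size=64, flush_size=3, writer=batches.append)
for msg in ["alpha", "beta", "gamma", "delta"]:
    buf.add(msg)
buf.flush()  # flush any remainder, as the plugin does on shutdown
# The third record trips flush_size, so "alpha"/"beta"/"gamma" go out as
# one batch and "delta" as a second one.
```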
For all available parameters, see the plugin's README. For general Logstash configuration options, see the Logstash documentation.
Start the task
From the Logstash installation directory, run:
```shell
bin/logstash -f config/logstash-analyticdb.conf
```

Logstash starts reading from the configured input and writing batches to AnalyticDB for MySQL.
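To sanity-check the pipeline, you can pipe a single test line into Logstash's stdin; assuming the configuration above, the line should appear as a new row in the log table:

```shell
echo "hello analyticdb" | bin/logstash -f config/logstash-analyticdb.conf
```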