Logstash is an open source data collection engine with real-time pipelining capabilities. It was originally used to write log data to the Elastic Stack. As the open source community has grown, Logstash can now dynamically unify data from disparate sources and normalize the data into destinations of your choice.

For example, AnalyticDB for MySQL allows access over Java Database Connectivity (JDBC). You can use a Logstash plug-in named logstash-output-jdbc to import log data into AnalyticDB for MySQL for subsequent analysis. However, JDBC writes records one at a time. If you use JDBC to write large amounts of log data to AnalyticDB for MySQL, the write performance is low and the CPU utilization is high. To resolve this issue, AnalyticDB for MySQL provides an optimized Logstash plug-in named logstash-output-analyticdb based on JDBC. This plug-in is dedicated to writing log data to AnalyticDB for MySQL in batches.

The logstash-output-analyticdb plug-in provides five times the write speed of the logstash-output-jdbc plug-in with lower CPU utilization.

Install the logstash-output-analyticdb plug-in

The following section describes how to install the logstash-output-analyticdb plug-in. For information about how to install Logstash, visit Installing Logstash.

  1. Run the following command to go to the root directory of Logstash: cd logstash.
  2. Run the following command to install the logstash-output-analyticdb plug-in: bin/logstash-plugin install logstash-output-analyticdb.
  3. Run the following command to create a directory named vendor/jar/jdbc under the root directory of Logstash: mkdir -p vendor/jar/jdbc.
  4. Run the following command to download the JDBC driver package to the vendor/jar/jdbc directory: cd vendor/jar/jdbc; wget http://central.maven.org/maven2/mysql/mysql-connector-java/5.1.36/mysql-connector-java-5.1.36.jar.

After you complete the preceding steps, the logstash-output-analyticdb plug-in is installed.
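
To confirm that the plug-in was installed, you can also list the installed Logstash plug-ins and check that logstash-output-analyticdb appears in the output. For example:

bin/logstash-plugin list | grep analyticdb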

Configure Logstash

Create a configuration file named logstash-analyticdb.conf in the config directory. You can also use a custom file name. The logstash-analyticdb.conf file contains the following content:

input {
    stdin { }
}
output {
    analyticdb {
        driver_class => "com.mysql.jdbc.Driver"
        connection_string => "jdbc:mysql://HOSTNAME:PORT/DATABASE?user=USER&password=PASSWORD"
        statement => [ "INSERT INTO log (host, timestamp, message) VALUES(?, ?, ?)", "host", "@timestamp", "message" ]
        commit_size => 4194304
    }
}
  • connection_string: the JDBC connection string used to connect to AnalyticDB for MySQL.
  • statement: an array that declares the INSERT statement and the event fields that are bound to its placeholders.

Other parameters:

  • max_flush_exceptions: specifies the maximum number of retries that can be attempted if an exception occurs during data write. Default value: 100.
  • skip_exception: specifies whether to skip exceptions. Default value: FALSE. If this parameter is set to FALSE and an exception occurs during a write, the write is retried up to the number of times specified by the max_flush_exceptions parameter. If all retries fail, an exception is thrown and the synchronization task is terminated. If this parameter is set to TRUE and all retries fail, the exception is skipped and only written to a log.
  • flush_size: the maximum number of records that can be buffered before a write is triggered. This parameter is used in combination with the commit_size parameter.
  • commit_size: the maximum amount of data that can be buffered before a write is triggered. This parameter is used in combination with the flush_size parameter. A write is submitted as soon as either limit is reached, as shown in the sketch after this list.
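
The following snippet is a sketch of how these optional parameters can be added to the analyticdb output block. The parameter values shown are examples only, not recommendations; tune them based on your workload:

output {
    analyticdb {
        driver_class => "com.mysql.jdbc.Driver"
        connection_string => "jdbc:mysql://HOSTNAME:PORT/DATABASE?user=USER&password=PASSWORD"
        statement => [ "INSERT INTO log (host, timestamp, message) VALUES(?, ?, ?)", "host", "@timestamp", "message" ]
        commit_size => 4194304          # flush when about 4 MB of data is buffered
        flush_size => 1000              # or when 1,000 records are buffered, whichever comes first
        max_flush_exceptions => 100     # retry a failed flush up to 100 times
        skip_exception => false         # terminate the task if all retries fail
    }
}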

The preceding configuration file is provided for reference only. Modify the logstash-analyticdb.conf file based on your actual environment. For more information about configurations related to AnalyticDB for MySQL, visit README. For more information about the configuration rules of Logstash, see the Logstash documentation.

After you configure the preceding parameters, the configuration is complete.

Start a task

Run the following command in the installation directory of Logstash to start a task: bin/logstash -f config/logstash-analyticdb.conf.
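
Because the sample configuration reads from stdin, you can also pipe a test message into Logstash to verify the pipeline end to end. For example, assuming that the log table referenced by the statement parameter already exists in your database:

echo "test message" | bin/logstash -f config/logstash-analyticdb.conf

Note that depending on your Logstash version and ECS settings, field names such as host may be structured differently, in which case you must adjust the field names in the statement parameter.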

Precautions

Before you write data to AnalyticDB for MySQL, we recommend that you run the following command to update the logstash-output-analyticdb plug-in to the latest version:

bin/logstash-plugin update logstash-output-analyticdb