This topic provides answers to some frequently asked questions about Alibaba Cloud Logstash.

How do I import data to or export data from Logstash over the Internet?

Logstash clusters are deployed in virtual private clouds (VPCs). You must configure NAT gateways to connect Logstash clusters to the Internet. For more information, see Configure a NAT gateway for data transmission over the Internet.

An error log is generated when a self-managed Kafka cluster is used as the source or destination of Logstash. What do I do?

  • Error log: No entry found for connection

    Cause: Logstash nodes cannot resolve the IP address that corresponds to the hostname of the Kafka service.

    Solution: Add the following configuration to the server.properties file of your Kafka broker. In this example, the Kafka service runs on port 9092 of the server whose IP address is 10.10.10.10.
    listeners=PLAINTEXT://10.10.10.10:9092
    advertised.listeners=PLAINTEXT://10.10.10.10:9092
    Notice
    • When you add the preceding configuration, you must replace 10.10.10.10:9092 with the IP address and port number of your Kafka cluster.
    • We recommend that you use Message Queue for Apache Kafka. Make sure that the IP addresses of your Logstash nodes are in the whitelist of your Kafka cluster.
  • Error log: could not be established. Broker may not be available

    Cause: The Kafka cluster does not exist or cannot be reached.

    Solution: Check whether the Kafka cluster runs as expected and whether bootstrap_servers is correctly set in the configuration of your Logstash pipeline.
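For reference, the following sketch shows a minimal Logstash pipeline that consumes data from a self-managed Kafka cluster. The broker address, topic name, consumer group ID, and Elasticsearch endpoint are placeholders that you must replace with your own values.

```conf
input {
  kafka {
    # Must match the advertised listener address of your Kafka broker.
    bootstrap_servers => "10.10.10.10:9092"
    topics => ["my-topic"]
    group_id => "logstash-consumer"
  }
}
output {
  elasticsearch {
    # Placeholder endpoint; replace with the address of your Elasticsearch cluster.
    hosts => ["http://<your-es-host>:9200"]
    index => "kafka-%{+YYYY.MM.dd}"
  }
}
```

If Logstash cannot resolve the broker hostname, the bootstrap_servers value here and the advertised.listeners value in server.properties must both use the IP address, as described above.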

Does the Logstash JDBC driver support MySQL databases?

Yes, the Logstash JDBC driver supports MySQL databases. You must make sure that the mysql-connector-java driver file is uploaded. For more information, see Configure third-party libraries.
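As a sketch, a JDBC input that reads from a MySQL database might look like the following. The driver path, connection string, credentials, and SQL statement are placeholders; jdbc_driver_library must point to the location of the mysql-connector-java file that you uploaded.

```conf
input {
  jdbc {
    # Placeholder path; replace with the actual path of the uploaded driver file.
    jdbc_driver_library => "/path/to/mysql-connector-java-5.1.49.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://10.0.0.1:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    # Run the query once per minute.
    schedule => "* * * * *"
    # Incremental query based on the last recorded value.
    statement => "SELECT * FROM my_table WHERE update_time > :sql_last_value"
  }
}
```

Note that for MySQL Connector/J 8.x, the driver class name is com.mysql.cj.jdbc.Driver instead.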

Can Logstash nodes be monitored?

Yes, Logstash nodes can be monitored. You can enable the X-Pack Monitoring feature for your Logstash cluster and associate the cluster with an Elasticsearch cluster. Then, you can monitor the nodes in your Logstash cluster in the Kibana console of the Elasticsearch cluster. For more information, see Enable the X-Pack Monitoring feature.

Can I upload script files to Logstash?

No, you cannot upload script files to Logstash. You can use only the configuration file of your Logstash cluster to manage pipelines and implement data transmission. For more information, see Use configuration files to manage pipelines.

Can I specify HTTP as the data collection protocol for Logstash?

Yes, you can specify HTTP as the data collection protocol for Logstash. Logstash can receive single-line or multiline events over HTTP or HTTPS. For more information, see Http input plugin.
Note By default, Alibaba Cloud Logstash cannot connect to the Internet. If you want to collect HTTP requests over the Internet, you must configure a NAT gateway. For more information, see Configure a NAT gateway for data transmission over the Internet.
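A minimal sketch of an HTTP input follows. The port and Elasticsearch endpoint are placeholders; the port is chosen from the 8000 to 9000 range that Alibaba Cloud Logstash allows for listener ports.

```conf
input {
  http {
    # Logstash listens on this port for HTTP POST requests.
    port => 8080
  }
}
output {
  elasticsearch {
    # Placeholder endpoint; replace with the address of your Elasticsearch cluster.
    hosts => ["http://<your-es-host>:9200"]
    index => "http-events-%{+YYYY.MM.dd}"
  }
}
```

You can then send a test event with a command such as curl -XPOST http://&lt;logstash-node-ip&gt;:8080 -d '{"message":"hello"}'.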

How do I use Logstash to synchronize data from Log Service to Elasticsearch?

You can use the logstash-input-sls plug-in. For more information, see Use the logstash-input-sls plug-in.

Can I use Logstash to synchronize data in real time?

Logstash is a near-real-time data synchronization tool. It continuously writes data to the destination until you stop the related pipeline or the source has no more data to synchronize.

After I create a pipeline, the process is stuck, and the update progress of my Logstash cluster remains unchanged. What do I do?

Check whether an error log for the cluster is generated and identify the cause based on the error. For more information, see Query logs. The following list describes common causes of errors and the solutions to the errors.

  • Cause: The pipeline is incorrectly configured.

    Solution: Pause the update of the Logstash cluster, wait until the update is paused, and then modify the configuration of the pipeline to trigger a cluster restart. For more information, see View the progress of a cluster task.
  • Cause: The disk usage of the Logstash cluster is excessively high.

    Solution: Upgrade the configuration of the Logstash cluster. For more information, see Upgrade the configuration of a cluster. Then, refresh the details page of the Logstash cluster and check the update progress.
  • Cause: The output of the pipeline is an Elasticsearch cluster for which the Auto Indexing feature is disabled.

    Solution: Enable the Auto Indexing feature for the Elasticsearch cluster. For more information, see Configure the YML file. Then, refresh the details page of the Logstash cluster and check the update progress.
  • Cause: A Beats shipper is specified in the input configuration of the pipeline, but the specified port is outside the range of 8000 to 9000.

    Solution: Pause the update of the Logstash cluster, wait until the update is paused, and then change the port number in the configuration of the pipeline to a value in the range of 8000 to 9000 to trigger a cluster restart.
  • Cause: The public IP address of the source or destination is specified in the configuration of the pipeline.

    Solution: Use one of the following solutions:
    • Pause the update of the Logstash cluster, wait until the update is paused, and then change the public IP address to the private IP address of the source or destination.
    • Configure a NAT gateway to implement data transmission over the Internet. For more information, see Configure a NAT gateway for data transmission over the Internet. Then, refresh the details page of the Logstash cluster and check the update progress.
  • Cause: file_extend is specified in the configuration of the pipeline, but the logstash-output-file_extend plug-in is not installed.

    Solution: Use one of the following solutions:
    • Install the logstash-output-file_extend plug-in. For more information, see Use the pipeline configuration debugging feature. Then, refresh the details page of the Logstash cluster and check the update progress.
    • Pause the update of the Logstash cluster, wait until the update is paused, and then remove file_extend from the configuration of the pipeline to trigger a cluster restart.
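For the Beats port requirement described above, the input must listen on a port in the range of 8000 to 9000. A minimal sketch, with an illustrative port number:

```conf
input {
  beats {
    # Alibaba Cloud Logstash requires the Beats listener port to be in the range of 8000 to 9000.
    port => 8044
  }
}
```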