
Simple Log Service: Collect data from Beats and Logstash

Last Updated: Jun 04, 2024

This topic describes how to configure Logtail in the Simple Log Service console to collect data from Beats and Logstash.

Prerequisites

  • Logtail is installed on the server that is used to collect data from Beats and Logstash. For Linux servers, Logtail 0.16.9 or later is required. For Windows servers, Logtail 1.0.0.8 or later is required. For more information, see Install Logtail on a Linux server or Install Logtail on a Windows server.

  • Data is collected by using Logstash or Beats.

    • For more information about how to collect data from Logstash, visit Logstash-Lumberjack-Output.

    • For more information about how to collect data from Beats, visit Beats-Lumberjack-Output.

      The procedure in this topic uses Packetbeat to collect data that is transmitted on the local network and uses the Logtail Lumberjack plug-in to upload the data to Simple Log Service. Packetbeat sends the collected data over its Logstash output to the port on which Logtail listens, as shown in the following sample script. A fuller configuration sketch is provided after the script:

      output.logstash:
        hosts: ["127.0.0.1:5044"]

Background information

Logstash and Beats (such as Metricbeat, Packetbeat, Winlogbeat, Auditbeat, Filebeat, and Heartbeat) support the Lumberjack protocol. Therefore, Logtail can use the Lumberjack protocol to receive data that is collected by Beats and Logstash and upload the data to Simple Log Service.

Note
  • You can configure multiple Lumberjack plug-ins, but these plug-ins cannot listen on the same port.

  • Lumberjack plug-ins support SSL. Data that is uploaded from Logstash to Simple Log Service must be encrypted by using SSL. A Logstash output sketch is provided after this note.
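
  The following Logstash pipeline output sketch shows how Logstash can send data to the port on which the Logtail Lumberjack plug-in listens. The sketch assumes that the logstash-output-lumberjack plug-in is installed. The host, port, and certificate path are placeholders and must match your Logtail configuration.

    output {
      lumberjack {
        # The address and port on which the Logtail Lumberjack plug-in listens (placeholder values)
        hosts => ["192.168.0.100"]
        port  => 5044
        # The lumberjack output requires SSL. The certificate path is a placeholder.
        ssl_certificate => "/etc/logstash/certs/logtail.crt"
      }
    }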

Procedure

  1. Log on to the Simple Log Service console.

  2. In the Import Data section, select Custom Data Plug-in.

  3. Select the project and Logstore. Then, click Next.

  4. In the Machine Group Configurations step, configure a machine group.

    1. Configure the Scenario and Installation Environment parameters based on your business requirements.

      Important

      You must configure the Scenario and Installation Environment parameters regardless of whether a machine group is available. The parameter settings affect subsequent configurations.

    2. Make sure that a machine group is displayed in the Applied Server Groups section and click Next.

      • If a machine group is available, select the machine group from the Source Machine Group section.

      • If no machine group is available, click Create Machine Group. In the Create Machine Group panel, configure the parameters. You can set the Machine Group Identifier parameter to IP Address or Custom Identifier. For more information, see Create a custom identifier-based machine group or Create an IP address-based machine group.

      Important

      If you apply a machine group immediately after you create the machine group, the heartbeat status of the machine group may be FAIL. This issue occurs because the machine group is not connected to Simple Log Service. To resolve this issue, you can click Retry. If the issue persists, see What do I do if no heartbeat connections are detected on Logtail?

  5. In the Configure Data Source step, set the Configuration Name and Plug-in Configuration parameters, and then click Next.

    • inputs is required and is used to configure the data source settings for the Logtail configuration.

      Important

      You can specify only one type of data source in inputs.

    • processors is optional and is used to configure the data processing settings for the Logtail configuration to parse data. You can specify one or more processing methods.

      If your logs cannot be parsed based only on the setting of inputs, you can configure processors in the Plug-in Configuration field to add plug-ins for data processing. For example, you can extract fields, extract log time, mask data, and filter logs. For more information, see Use Logtail plug-ins to process data.

    Data from Beats and Logstash is in the JSON format. processor_anchor is configured to expand the JSON-formatted data.

    {
      "inputs": [
        {
          "detail": {
            "BindAddress": "0.0.0.0:5044"
          },
          "type": "service_lumberjack"
        }
      ],
      "processors": [
        {
          "detail": {
            "Anchors": [
              {
                "ExpondJson": true,
                "FieldType": "json",
                "Start": "",
                "Stop": ""
              }
            ],
            "SourceKey": "content"
          },
          "type": "processor_anchor"
        }
      ]
    }
                            

    The following parameters are supported by the service_lumberjack plug-in:

    • type (String, required): The type of the data source. Set the value to service_lumberjack.

    • BindAddress (String, optional): The IP address and port of the server to which data can be sent by using the Lumberjack protocol. Default value: 127.0.0.1:5044. To enable access from other hosts in the LAN by using the Lumberjack protocol, set the value to 0.0.0.0:5044.

    • V1 (Boolean, optional): Specifies whether to use the Lumberjack protocol v1. Default value: false. Logstash supports the Lumberjack protocol v1.

    • V2 (Boolean, optional): Specifies whether to use the Lumberjack protocol v2. Default value: true. Beats support the Lumberjack protocol v2.

    • SSLCA (String, optional): The path of the certificate of the Certificate Authority (CA) that issues the signature certificate. Default value: null. If you use a self-signed certificate, you do not need to specify this parameter.

    • SSLCert (String, optional): The path of the certificate. Default value: null.

    • SSLKey (String, optional): The path of the private key that corresponds to the certificate. Default value: null.

    • InsecureSkipVerify (Boolean, optional): Specifies whether to skip the SSL security check. Default value: false. The default value indicates that the SSL security check is performed.
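
    For example, the following sketch shows an inputs setting that enables SSL and the Lumberjack protocol v1 to receive data from Logstash. The certificate and private key paths are placeholders that you must replace with the paths on your server.

    {
      "inputs": [
        {
          "detail": {
            "BindAddress": "0.0.0.0:5044",
            "V1": true,
            "V2": false,
            "SSLCert": "/usr/local/ssl/logtail-lumberjack.crt",
            "SSLKey": "/usr/local/ssl/logtail-lumberjack.key"
          },
          "type": "service_lumberjack"
        }
      ]
    }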

  6. Create indexes and preview data. Then, click Next. By default, full-text indexing is enabled in Simple Log Service. You can also manually create field indexes for the collected logs or click Automatic Index Generation. Then, Simple Log Service generates field indexes. For more information, see Create indexes.

    Important

    If you want to query all fields in logs, we recommend that you use full-text indexes. If you want to query only specific fields, we recommend that you use field indexes. This helps reduce index traffic. If you want to analyze fields, you must create field indexes. You must include a SELECT statement in your query statement for analysis.

  7. Click Log Query. You are redirected to the query and analysis page of your Logstore.

    You must wait approximately 1 minute for the indexes to take effect. Then, you can view the collected logs on the Raw Logs tab. For more information, see Query and analyze logs.

Troubleshooting

If no data is displayed on the preview page or query page after logs are collected by using Logtail, you can troubleshoot the errors based on the instructions that are provided in What do I do if errors occur when I use Logtail to collect logs?

What to do next

After Logtail uploads data to Simple Log Service, you can view the data in the Simple Log Service console. The following content is the sample data uploaded to Simple Log Service.

_@metadata_beat:  packetbeat
_@metadata_type:  doc
_@metadata_version:  6.2.4
_@timestamp:  2018-06-05T03:58:42.470Z
__source__:  **.**.**.**
__tag__:__hostname__:  *******
__topic__:  
_beat_hostname:  bdbe0b8d53a4
_beat_name:  bdbe0b8d53a4
_beat_version:  6.2.4
_bytes_in:  56
_bytes_out:  56
_client_ip:  192.168.5.2
_icmp_request_code:  0
_icmp_request_message:  EchoRequest(0)
_icmp_request_type:  8
_icmp_response_code:  0
_icmp_response_message:  EchoReply(0)
_icmp_response_type:  0
_icmp_version:  4
_ip:  127.0.0.1
_path:  127.0.0.1
_responsetime:  0
_status:  OK
_type:  icmp
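
For example, after field indexes are created for these fields, you can run a query statement similar to the following to count ICMP requests by client IP address. The statement is a sketch: the field names are taken from the sample data above, and the syntax assumes that field indexes exist for the _type and _client_ip fields.

_type: icmp | SELECT "_client_ip", COUNT(*) AS requests GROUP BY "_client_ip"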