Simple Log Service: Overview

Last Updated: Jan 25, 2024

Logtail provides plug-ins to parse raw logs into structured data.

Logtail plug-ins for data processing are classified into native plug-ins and extended plug-ins.

  • Native plug-ins provide high performance and are suitable for most business scenarios. We recommend that you use native plug-ins.

  • Extended plug-ins provide more features. If you cannot process complex business logs by using native plug-ins, you can use extended plug-ins to parse logs. However, your system performance may be affected.

Limits

  • Limits on performance

    If you use an extended Logtail plug-in to process logs, Logtail consumes more resources, most of which are CPU resources. You can modify the startup parameters of Logtail based on your business scenario, as shown in the first sketch after this list. For more information, see Configure the startup parameters of Logtail. If raw logs are generated at a rate higher than 5 MB/s, we recommend that you do not use complicated combinations of plug-ins. In this case, use extended Logtail plug-ins only for simple processing, and use the data transformation feature to further process the logs.

  • Limits on log collection

    Extended plug-ins use the line mode to process text logs. In this mode, file-level metadata such as __tag__:__path__ and __topic__ is stored in each log. If you use extended plug-ins to process data, the following limits apply to tag-related features:

    • By default, the contextual query and LiveTail features are unavailable.

      If you want to use these features, you must add the aggregators configuration, as shown in the second sketch after this list.

    • By default, the name of the __topic__ field is changed to __log_topic__.

      After you add the aggregators configuration, both the __topic__ field and the __log_topic__ field exist in logs. If the __log_topic__ field is no longer needed, you can use the processor_drop plug-in to drop the field, as the second sketch after this list also shows.

    • Fields such as __tag__:__path__ are not indexed by default. You must manually configure indexes for these fields.
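
The following two sketches illustrate these limits. The first is a minimal sketch of the resource-related startup parameters in the Logtail startup configuration file (ilogtail_config.json); the values are illustrative. cpu_usage_limit caps the number of CPU cores that Logtail may use, and mem_usage_limit caps its memory usage in MB. For the defaults and the full parameter list, see Configure the startup parameters of Logtail.

    {
      "cpu_usage_limit": 2,
      "mem_usage_limit": 2048
    }

The second is a minimal sketch of an extended plug-in configuration that adds the aggregators setting to restore the contextual query and LiveTail features and uses the processor_drop plug-in to drop the redundant __log_topic__ field. The aggregator_context type and the DropKeys parameter follow common Logtail plug-in conventions; verify them against the reference topics for your Logtail version.

    {
      "processors": [
        {
          "type": "processor_drop",
          "detail": {
            "DropKeys": ["__log_topic__"]
          }
        }
      ],
      "aggregators": [
        {
          "type": "aggregator_context",
          "detail": {}
        }
      ]
    }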

Procedure

If you want to use Logtail plug-ins to parse logs, you can add the required plug-ins when you create or modify a Logtail configuration.

Add plug-ins when you modify a Logtail configuration

  1. Log on to the Simple Log Service console.

  2. In the Projects section, click the project that you want to manage.

  3. Choose Log Storage > Logstores. Click the > icon next to the Logstore that you want to manage. Then, choose Data Import > Logtail Configurations.

  4. In the Logtail Configurations list, click the Logtail configuration that you want to manage.

  5. On the Logtail Configuration page, click Edit.

  6. In the Processor Configurations section, add Logtail plug-ins and click Save.

Add plug-ins when you create a Logtail configuration

When you create a Logtail configuration, you can add Logtail plug-ins. In the Import Data section, click the On-premises Open Source/Commercial Software tab and select the required data source. Then, configure the settings in the Specify Logstore, Create Machine Group, and Machine Group Settings steps. In the Logtail Config or Specify Data Source step, add the plug-ins. For more information, see Collect text logs from servers.

A Logtail plug-in configuration that you add when you create a Logtail configuration works in the same manner as one that you add when you modify an existing Logtail configuration.

Logtail plug-ins for data processing

Important
  • Native plug-ins can be used only to collect text logs.

  • You cannot use native plug-ins and extended plug-ins in the same Logtail configuration.

  • If you add native plug-ins, you must take note of the following items:

    • You must add one of the following Logtail plug-ins for data parsing as the first plug-in: Data Parsing (Regex Mode), Data Parsing (Delimiter Mode), Data Parsing (JSON Mode), Data Parsing (NGINX Mode), Data Parsing (Apache Mode), and Data Parsing (IIS Mode).

    • After you add the first plug-in, you can add one Time Parsing plug-in, one Data Filtering plug-in, and multiple Data Masking plug-ins, as shown in the sketch after this list.
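
For example, a native plug-in combination that follows these rules first parses raw logs in regex mode and then parses the extracted time field. The following minimal sketch uses the YAML pipeline format of the open source iLogtail project; the plug-in type names, parameter names, and the sample regular expression are assumptions to verify against your Logtail version. In the console, you add the equivalent plug-ins in the Processor Configurations section.

    processors:
      # First plug-in: Data Parsing (Regex Mode).
      - Type: processor_parse_regex_native
        SourceKey: content
        Regex: (\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2})\s\[(\w+)\]\s(.*)
        Keys:
          - time
          - level
          - message
      # Optional: one Time Parsing plug-in, applied to the extracted time field.
      - Type: processor_parse_timestamp_native
        SourceKey: time
        SourceFormat: '%Y-%m-%d %H:%M:%S'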

Native plug-ins

  • Data Parsing (Regex Mode): Extracts log fields based on a regular expression and parses logs into key-value pairs. For more information, see Parsing in regex mode.

  • Data Parsing (JSON Mode): Parses JSON logs into key-value pairs. For more information, see Parsing in JSON mode.

  • Data Parsing (Delimiter Mode): Parses logs into key-value pairs based on a specified delimiter. For more information, see Parsing in delimiter mode.

  • Data Parsing (NGINX Mode): Structures NGINX logs and parses them into key-value pairs. For more information, see Parsing in NGINX mode.

  • Data Parsing (Apache Mode): Structures Apache logs and parses them into key-value pairs. For more information, see Parsing in Apache mode.

  • Data Parsing (IIS Mode): Structures IIS logs and parses them into key-value pairs. For more information, see Parsing in IIS mode.

  • Time Parsing: Parses the log time. For more information, see Time parsing.

  • Data Filtering: Filters logs. For more information, see Data filtering.

  • Data Masking: Masks sensitive content in logs. For more information, see Data masking.

Extended plug-ins

  • Extract fields:

    • Extracts fields by using a regular expression. For more information, see Regex mode.

    • Extracts fields by anchoring start and stop keywords. For more information, see Anchor mode.

    • Extracts fields in CSV mode. For more information, see CSV mode.

    • Extracts fields by using a single-character delimiter. For more information, see Single-character delimiter mode.

    • Extracts fields by using a multi-character delimiter. For more information, see Multi-character delimiter mode.

    • Extracts fields by splitting key-value pairs. For more information, see Key-value pair mode.

    • Extracts fields by using Grok expressions. For more information, see Grok mode.

  • Add fields: Adds fields to logs. For more information, see Add fields.

  • Drop fields: Drops fields from logs. For more information, see Drop fields.

  • Rename fields: Renames fields. For more information, see Rename fields.

  • Encapsulate fields: Encapsulates one or more fields into a JSON object-formatted field. For more information, see Encapsulate fields.

  • Expand JSON fields: Expands JSON fields. For more information, see Expand JSON fields.

  • Filter logs:

    • Uses regular expressions to match the values of log fields and filter logs. For more information, see processor_filter_regex.

    • Uses regular expressions to match the names of log fields and filter logs. For more information, see processor_filter_key_regex.

  • Extract log time: Parses the time field in raw logs and specifies the parsing result as the log time. For more information, see Time format supported by Go.

  • Convert IP addresses: Converts IP addresses in logs to geographical locations, including the country, province, city, longitude, and latitude. For more information, see Convert IP addresses.

  • Mask sensitive data: Replaces sensitive data in logs with specified strings or MD5 hash values. For more information, see Mask sensitive data.

  • Map field values: Maps field values. For more information, see Map field values.

  • Encrypt fields: Encrypts specific fields. For more information, see Encrypt fields.

  • Encode and decode data:

    • Decodes Base64-encoded field values. For more information, see Base64 decoding.

    • Encodes field values in Base64. For more information, see Base64 encoding.

    • Encodes data by using the MD5 algorithm. For more information, see MD5 encoding.

  • Convert logs to Simple Log Service metrics: Converts collected logs to Simple Log Service metrics. For more information, see Convert logs to Simple Log Service metrics.

  • Convert logs to Simple Log Service traces: Converts collected logs to Simple Log Service traces. For more information, see Convert logs to Simple Log Service traces.
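
As an example of combining extended plug-ins, the following minimal sketch first extracts fields by using the processor_regex plug-in and then adds a constant field by using the processor_add_fields plug-in. The regular expression, the field names, and the service value are illustrative; see the reference topics above for the full parameter lists.

    {
      "processors": [
        {
          "type": "processor_regex",
          "detail": {
            "SourceKey": "content",
            "Regex": "(\\d+-\\d+-\\d+\\s\\d+:\\d+:\\d+)\\s\\[(\\w+)\\]\\s(.*)",
            "Keys": ["time", "level", "message"]
          }
        },
        {
          "type": "processor_add_fields",
          "detail": {
            "Fields": {
              "service": "my-service"
            }
          }
        }
      ]
    }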