This topic describes how to create, view, modify, and delete Logtail configurations for log collection in the Simple Log Service console. In addition to console operations, Simple Log Service also supports API and SDK methods.
Overview of Logtail configurations
Logtail configurations define the core rules for how to collect and process log data. Their purpose is to enable efficient log collection, structured parsing, filtering, and processing through flexible configuration.
Logtail configurations in the console comprise three parts:
Global configurations are used to organize collected text logs into different log topics.
Input configurations are used to define the collection details.
Processor configurations are used to transform raw logs into structured data.
Create a Logtail configuration
Log on to the Simple Log Service console. In the Projects section, click the one you want.

Find your logstore, and select . Click Integrate Now. In this example, Regular Expression - Text Log is used, which means text logs will be parsed using regular expression matching.

Select Servers and ECS. Select the machine group you created earlier, and click the > button to add it to the applied machine group. Then click Next. If no machine group is available, see Create a machine group.

In Global Configurations, enter the configuration name. In Other Global Configurations, set the log topic.
The log topic configuration items are described as follows. For detailed parameters, see Global configuration parameters.
Machine Group Topic: If you select this option, you must configure the topic when you create the machine group.
File Path Extraction: If you select this option, you must configure the regular expression.
Custom: If you select this option, enter customized:// + custom topic name to use a custom static log topic.
In Input Configurations, configure the File Path, which is the path from which logs are collected. The log path must start with a forward slash (/), such as /data/wwwlogs/main/**/*.Log, which indicates files with the .Log suffix in the /data/wwwlogs/main directory. To set the maximum depth of the log directory to be monitored (that is, the maximum directory depth that the wildcard ** in the File Path can match), modify the value of Maximum Directory Monitoring Depth. A value of 0 specifies that only the specified log file directory is monitored. For detailed parameters, see Input configuration parameters.
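The wildcard matching described above can be illustrated with Python's glob semantics. This is only a rough sketch: unlike Logtail, Python's ** recurses without a configurable depth limit, and the directory layout below is invented for the example.

```python
# Illustrate what the pattern /data/wwwlogs/main/**/*.Log matches, using
# Python's pathlib glob. Logtail additionally caps recursion at the
# configured maximum directory monitoring depth; Python's ** does not.
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp()) / "data" / "wwwlogs" / "main"
for rel in ["app.Log", "sub/app.Log", "sub/deep/app.Log", "app.txt"]:
    p = root / rel
    p.parent.mkdir(parents=True, exist_ok=True)
    p.touch()

matches = sorted(p.relative_to(root).as_posix() for p in root.glob("**/*.Log"))
print(matches)  # app.txt is excluded; .Log files at any depth match
```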
In Processor Configurations, set Log Sample, Multi-line Mode, and Processing Method.

We recommend that you add a sample log in the Log Sample field. Sample logs can help you easily configure log processing-related parameters. If you configure this field, use a sample log from an actual collection scenario.
Turn on Multi-line Mode as needed. A multi-line log spans multiple consecutive lines. If you turn it off, Simple Log Service collects logs in single-line mode. Each log is placed in a line. If you turn it on, configure the following parameters:
Type
Custom: If the format of the raw logs is not fixed, configure Regex to Match First Line to identify the first line of each log. For example, the regular expression \[\d+-\d+-\w+:\d+:\d+,\d+]\s\[\w+]\s.* splits the five lines of raw data in the following example into two logs. Note that the value of the Regex to Match First Line parameter must match the entire line of data.
[2023-10-01T10:30:01,000] [INFO] java.lang.Exception: exception happened
    at TestPrintStackTrace.f(TestPrintStackTrace.java:3)
    at TestPrintStackTrace.g(TestPrintStackTrace.java:7)
    at TestPrintStackTrace.main(TestPrintStackTrace.java:16)
[2023-10-01T10:31:01,000] [INFO] java.lang.Exception: exception happened
Multi-line JSON: If the raw logs are in standard JSON format, set Type to Multi-line JSON. Logtail automatically processes the line feeds that occur within a JSON-formatted log.
Processing Method If Splitting Fails:
Discard: Discards the text.
Retain Single Line: Saves each line of the text as a log.
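The first-line matching behavior described above can be sketched in a few lines of Python: a line that matches the regex starts a new log, and every non-matching line is appended to the current log. This mirrors the documented behavior, not Logtail's actual implementation.

```python
# Split multi-line text into logs using a first-line regex. A matching line
# starts a new log; non-matching lines are continuation lines.
import re

FIRST_LINE = re.compile(r"\[\d+-\d+-\w+:\d+:\d+,\d+]\s\[\w+]\s.*")

raw = """\
[2023-10-01T10:30:01,000] [INFO] java.lang.Exception: exception happened
    at TestPrintStackTrace.f(TestPrintStackTrace.java:3)
    at TestPrintStackTrace.g(TestPrintStackTrace.java:7)
    at TestPrintStackTrace.main(TestPrintStackTrace.java:16)
[2023-10-01T10:31:01,000] [INFO] java.lang.Exception: exception happened
"""

logs = []
for line in raw.splitlines():
    if FIRST_LINE.fullmatch(line):
        logs.append(line)          # start a new log
    elif logs:
        logs[-1] += "\n" + line    # continuation line
print(len(logs))  # 2
```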
Processors: Set Processing Method to Processors. A processor is configured to split logs. In this example, Logtail collects text logs in full regex mode, and a Data Parsing (Regex Mode) processor is automatically generated. You can use other processors as needed.
The following describes common processors. For more processor capabilities such as time parsing, filtering, and data masking, see Processing plugins. Simple Log Service also provides SPL-based data processing, which features higher processing efficiency while implementing functions similar to traditional processors. For more information, see Use Logtail SPL to parse logs.
Data Parsing (Regex Mode)
Click Data Parsing (Regex Mode) to enter the processor configuration page.

Configure the regular expression and specify keys for the extracted values. Under Regular Expression, click Generate, select the content to extract in the log sample, and then click Generate Regular Expression. A regular expression is automatically generated for the selected content.

After the regular expression is generated, specify keys based on the extracted values in the Extracted Field. These key-value pairs can be used to create indexes.
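Conceptually, the processor applies a regular expression with capture groups to the raw log in the content field and maps each captured value to a key that you specify. The following sketch uses a handwritten regex and invented key names for illustration; the console generates its own expression from your log sample.

```python
# Sketch of regex-mode parsing: capture groups become field values, and the
# keys you specify become field names. Regex and keys here are illustrative.
import re

pattern = re.compile(r"(\S+) - (\S+) \[([^]]+)] \"(\w+) (\S+)")
keys = ["ip", "user", "time", "method", "url"]

content = '127.0.0.1 - admin [01/Oct/2023:10:30:01 +0800] "GET /index.html'
m = pattern.match(content)
fields = dict(zip(keys, m.groups()))
print(fields["ip"], fields["method"])  # 127.0.0.1 GET
```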

For more information, see Data Parsing (Regex Mode).
Data Parsing (JSON Mode)
Important: To process collected JSON logs, add a Data Parsing (JSON Mode) processor. JSON logs can be in object or array format. An object log contains key-value pairs, and an array log contains an ordered list of values. The Data Parsing (JSON Mode) processor can parse object-type JSON logs and extract key-value pairs from the first layer. The extracted keys become field names, and the values become field values. The processor cannot parse JSON logs of the array type. For more granular processing, see Extended plugin: expand JSON fields.
Turn on Multi-line Mode as needed. If you turn it on, do as follows:
Set Type to Multi-line JSON.
Set Processing Method If Splitting Fails to Retain Single Line.

Delete Data Parsing (Regex Mode) from the Processing Method list and add Data Parsing (JSON Mode).

The following table describes the parameters of Data Parsing (JSON Mode):
Parameter name
Description
Original Field
The original field that stores log content before parsing. Default value: content.
Retain Original Field if Parsing Fails
If selected, the original field is retained when parsing fails.
Retain Original Field if Parsing Succeeds
If selected, the original field is retained when parsing succeeds.
New Name of Original Field
After you select Retain Original Field if Parsing Fails or Retain Original Field if Parsing Succeeds, rename the original field storing the raw log content.
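The first-layer extraction rule described above can be sketched as follows. This is a minimal illustration of the documented behavior, not Logtail code: top-level keys of an object-type log become fields, nested structures stay as serialized values, and array-type logs are rejected.

```python
# First-layer JSON parsing: only top-level keys of an object become fields.
import json

def parse_json_log(content):
    try:
        obj = json.loads(content)
    except ValueError:
        return None                     # not valid JSON: parsing fails
    if not isinstance(obj, dict):
        return None                     # array-type logs are not parsed
    # Keep first-layer values; re-serialize anything that is not a string.
    return {k: v if isinstance(v, str) else json.dumps(v)
            for k, v in obj.items()}

fields = parse_json_log('{"url": "/request/path", "status": 200, "extra": {"a": 1}}')
print(fields)  # {'url': '/request/path', 'status': '200', 'extra': '{"a": 1}'}
```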
For more information, see Data Parsing (JSON Mode).
Data Parsing (Delimiter Mode)
Note: Use a Data Parsing (Delimiter Mode) processor to parse logs into multiple key-value pairs based on a specific delimiter.
Delete the Data Parsing (Regex Mode) processor from the Processing Method list and add a Data Parsing (Delimiter Mode) processor.

The following table describes the parameters of the Data Parsing (Delimiter Mode) processor.
Parameter
Description
Original Field
The original field that stores log content before parsing. Default value: content.
Delimiter
The delimiter based on which you want to extract log fields. Select a delimiter based on the actual log content, such as Vertical Bar (|).
Note: If you select Non-printable Character as the delimiter, find the hexadecimal value of the invisible character in the ASCII table and enter it in the format 0x<hexadecimal value of the invisible character in the ASCII table>. For example, the first invisible character in the ASCII table is 0x01.
Quote
When log field content contains delimiters, you must specify quotes to wrap the content. Content wrapped in quotes will be parsed by Simple Log Service as a complete field. You must select a quote based on the format of logs that you want to collect.
Note: If you select Non-printable Character as the quote, find the hexadecimal value of the invisible character in the ASCII table and enter it in the format 0x<hexadecimal value of the invisible character in the ASCII table>. For example, the first invisible character in the ASCII table is 0x01.
Extracted Field
If you configure a log sample, Simple Log Service extracts log content based on the log sample and the delimiter you select, and defines the content as values. Specify a key for each value.
If you do not specify a sample log, the Value column is unavailable. Specify keys based on the actual logs and the delimiter.
A key can contain only letters, digits, and underscores (_) and must start with a letter or an underscore (_). A key can be up to 128 bytes in length.
Allow Missing Field
Specifies whether to upload logs to Simple Log Service if the number of values extracted from logs is less than the number of keys. If you select Allow Missing Field, the logs are uploaded.
For example, assume that the log is 11|22|33|44, the delimiter is the vertical bar (|), and the keys are A, B, C, D, and E.
If you select Allow Missing Field, the value of the E field is empty, and the log is uploaded to Simple Log Service. Otherwise, the log is discarded.
Note: Linux Logtail 1.0.28 and later or Windows Logtail 1.0.28.0 and later support the Allow Missing Field parameter for delimiter mode configuration.
Processing Method of Field to which Excess Part is Assigned
The method for handling extra values when the number of extracted values exceeds the specified keys. Valid values:
Expand: Retains the excess values and adds them to fields in the __column$i__ format, where $i represents the sequence number of the excess field, starting from 0. Examples: __column0__ and __column1__.
Retain: Retains the excess values and adds them to a field named __column0__.
Discard: Discards the excess values.
Retain Original Field if Parsing Fails
Retains the original field when parsing fails.
Retain Original Field if Parsing Succeeds
Retains the original field when parsing succeeds.
New Name of Original Field
After you select Retain Original Field if Parsing Fails or Retain Original Field if Parsing Succeeds, rename the original field storing the raw log content.
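The Allow Missing Field and Expand rules described in this table can be sketched as follows. This mimics the documented behavior for illustration only; it is not Logtail code.

```python
# Delimiter-mode splitting with the documented handling of missing values
# (Allow Missing Field) and excess values (Expand: __column0__, __column1__, ...).
def split_delimited(content, keys, delimiter="|", allow_missing=True):
    values = content.split(delimiter)
    if len(values) < len(keys):
        if not allow_missing:
            return None                              # log is discarded
        values += [""] * (len(keys) - len(values))   # missing fields are empty
    fields = dict(zip(keys, values))
    for i, extra in enumerate(values[len(keys):]):   # Expand excess values
        fields[f"__column{i}__"] = extra
    return fields

print(split_delimited("11|22|33|44", ["A", "B", "C", "D", "E"]))
# {'A': '11', 'B': '22', 'C': '33', 'D': '44', 'E': ''}
print(split_delimited("11|22|33", ["A", "B"]))
# {'A': '11', 'B': '22', '__column0__': '33'}
```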
For more information, see Data Parsing (Delimiter Mode).
Data Parsing (Apache Mode)
Note: Use a Data Parsing (Apache Mode) processor to parse Apache logs into structured data based on the log format that you specify in the Apache configuration file. A log is parsed into multiple key-value pairs.
Procedure
Delete the Data Parsing (Regex Mode) processor from the Processing Method list and add a Data Parsing (Apache Mode) processor.

The following table describes the parameters of the Data Parsing (Apache Mode) processor.
Parameter name
Description
Log Format
Select the log format defined in the Apache configuration file, such as common, combined, or custom.
APACHE LogFormat Configuration
The log configuration section specified in the Apache configuration file. In most cases, the section starts with LogFormat.
When you set Log Format to common or combined, the configuration fields of the corresponding format are automatically filled. Confirm whether the format is consistent with the format defined in the Apache configuration file.
When you set Log Format to Custom, fill in this field as needed, for example,
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" %D %f %k %p %q %R %T %I %O" customized.
Original Field
The original field storing the log content before parsing. Default value: content.
Regular Expression
The regular expression extracting Apache logs. Simple Log Service automatically generates this regular expression based on the content in the APACHE LogFormat Configuration.
Extracted Field
Automatically generates log fields (keys) based on the content in the APACHE LogFormat Configuration.
Retain Original Field if Parsing Fails
Retains the original field when parsing fails.
Retain Original Field if Parsing Succeeds
Retains the original field when parsing succeeds.
New Name of Original Field
After you select Retain Original Field if Parsing Fails or Retain Original Field if Parsing Succeeds, rename the original field storing the raw log content.
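As a rough sketch of what the generated regular expression does for the Apache common format (%h %l %u %t \"%r\" %>s %b), the example below parses one common-format line with a handwritten regex. The regex and sample line are illustrative; the console derives the real expression from the APACHE LogFormat Configuration.

```python
# Parse an Apache "common" log line into key-value pairs with named groups.
import re

COMMON = re.compile(
    r'(?P<remote_addr>\S+) (?P<remote_ident>\S+) (?P<remote_user>\S+) '
    r'\[(?P<time_local>[^]]+)] "(?P<request>[^"]*)" '
    r'(?P<status>\d+) (?P<body_bytes_sent>\S+)'
)

line = '192.168.1.2 - frank [10/Oct/2023:13:55:36 +0800] "GET /apache.gif HTTP/1.0" 200 2326'
m = COMMON.match(line)
print(m.group("status"), m.group("request"))  # 200 GET /apache.gif HTTP/1.0
```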
For more information, see Data Parsing (Apache Mode).
Data Parsing (NGINX Mode)
Note: Use a Data Parsing (NGINX Mode) processor to parse NGINX logs into structured data based on the log format that you specify in the NGINX configuration file. A log is parsed into multiple key-value pairs.
Delete the Data Parsing (Regex Mode) processor from the Processing Method list, then add a Data Parsing (NGINX Mode) processor.

The following table describes the parameters of the Data Parsing (NGINX Mode) processor.
Parameter name
Description
NGINX Log Configuration
The log configuration section in the NGINX configuration file, which starts with log_format. Example:
log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                '$request_time $request_length '
                '$status $body_bytes_sent "$http_referer" '
                '"$http_user_agent"';
For more information, see Introduction to NGINX logs.
Original Field
The original field that stores the log content before parsing. Default value: content.
Regular Expression
The regular expression that is used to extract NGINX logs. Simple Log Service automatically generates this regular expression based on the content in NGINX Log Configuration.
Extracted Field
Automatically extracts the corresponding log fields (keys) based on the NGINX Log Configuration.
Retain Original Field if Parsing Fails
Retains the original field when parsing fails.
Retain Original Field if Parsing Succeeds
Retains the original field when parsing succeeds.
New Name of Original Field
After you select Retain Original Field if Parsing Fails or Retain Original Field if Parsing Succeeds, rename the original field storing the raw log content.
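The derivation from a log_format string to extraction fields can be sketched as follows: each $variable in the format becomes a key, and the literal text around it anchors the regex. This is a handwritten illustration under simplifying assumptions (lazy groups, a trimmed format string); the console derives the real regex itself.

```python
# Build a regex from an NGINX log_format string: escape literal text and
# turn each $variable into a named capture group.
import re

log_format = ('$remote_addr - $remote_user [$time_local] "$request" '
              '$status $body_bytes_sent')

parts = re.split(r"\$(\w+)", log_format)   # alternating literals and names
pattern = ""
for i, part in enumerate(parts):
    if i % 2 == 0:
        pattern += re.escape(part)          # literal text
    else:
        pattern += f"(?P<{part}>.*?)"       # $variable -> named group
nginx_re = re.compile(pattern + "$")

line = '192.168.1.2 - - [10/Oct/2023:13:55:36 +0800] "GET / HTTP/1.1" 200 612'
m = nginx_re.match(line)
print(m.group("remote_addr"), m.group("status"))
```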
For more information, see Data Parsing (NGINX Mode).
Data Parsing (IIS Mode)
Note: Use a Data Parsing (IIS Mode) processor to parse IIS logs into structured data based on the log format that you specify in the IIS configuration file. A log is parsed into multiple key-value pairs.

The following table describes the parameters of the Data Parsing (IIS Mode) processor.
Parameter name
Description
Log Format
Select the log format used by your IIS server logs.
IIS: Microsoft IIS log file format.
NCSA: NCSA common log file format.
W3C: W3C extended log file format.
IIS Configuration Fields
The IIS configuration fields:
If you set Log Format to IIS or NCSA, the system automatically specifies the IIS configuration fields.
If you set Log Format to W3C, specify the content of the logExtFileFlags parameter in the IIS configuration file. Example:
logExtFileFlags="Date, Time, ClientIP, UserName, SiteName, ComputerName, ServerIP, Method, UriStem, UriQuery, HttpStatus, Win32Status, BytesSent, BytesRecv, TimeTaken, ServerPort, UserAgent, Cookie, Referer, ProtocolVersion, Host, HttpSubStatus"
Default path of the IIS5 configuration file: C:\WINNT\system32\inetsrv\MetaBase.bin.
Default path of the IIS6 configuration file: C:\WINDOWS\system32\inetsrv\MetaBase.xml.
Default path of the IIS7 configuration file: C:\Windows\System32\inetsrv\config\applicationHost.config.
Original Field
The original field that stores the log content before parsing. Default value: content.
Regular Expression
The regular expression that is used to extract IIS logs. Simple Log Service automatically generates this regular expression based on the content in the IIS Configuration Fields.
Extracted Field
Automatically generates log fields (keys) based on the content in the IIS Configuration Fields.
Retain Original Field if Parsing Fails
Retains the original field when parsing fails.
Retain Original Field if Parsing Succeeds
Retains the original field when parsing succeeds.
New Name of Original Field
After you select Retain Original Field if Parsing Fails or Retain Original Field if Parsing Succeeds, rename the original field storing the raw log content.
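Because W3C extended logs are space-separated values in the order declared by the fields list, the parsing can be sketched as a zip of declared names to tokens. The field names and sample line below are invented for illustration.

```python
# W3C extended log parsing sketch: pair declared field names with the
# space-separated tokens of a log line, in order.
fields_decl = "date time c-ip cs-username cs-method cs-uri-stem sc-status"
line = "2023-10-01 10:30:01 192.168.1.2 - GET /index.htm 200"

record = dict(zip(fields_decl.split(), line.split()))
print(record["c-ip"], record["sc-status"])  # 192.168.1.2 200
```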
For more information, see Data Parsing (IIS Mode).
SPL-based data processing
Simple Log Service offers custom SPL-based data processing. Compared with traditional plugins, SPL-based processing is faster, more efficient, and easier to use, and lets you process data using SPL statements and their computing features. For more information, see Use Logtail SPL to parse logs.
View a Logtail configuration
Log on to the Simple Log Service console.
In the Projects section, click the one you want to manage.

On the tab, click the > icon in front of the target logstore, then choose .
Click the target Logtail configuration to view its details.
Modify a Logtail configuration
Log on to the Simple Log Service console.
In the Projects section, click the one you want to manage.

On the tab, click the > icon in front of the target logstore, then choose .
In the Logtail Configuration list, click the target Logtail configuration.
On the Logtail Configuration page, click Edit.
Modify the configuration and click Save.
For more information, see Overview of Logtail configurations.
Delete a Logtail configuration
In the Logtail Configuration list, select the target Logtail configuration and click Delete in the Actions column.
In the Delete dialog box, click OK.
After the Logtail configuration is deleted, it is detached from the machine group, and Logtail stops collecting the logs based on the configuration.
Note: To delete a logstore, you must first delete all Logtail configurations associated with it.