This topic describes how to create Logstash configurations for log collection and
processing.
Plug-ins
- logstash-input-file plug-in
The logstash-input-file plug-in collects logs by tailing files, in a way similar to the tail
command. For more information, visit logstash-input-file.
- logstash-output-logservice plug-in
The logstash-output-logservice plug-in processes the collected logs and uploads them to
Log Service. A minimal skeleton that combines both plug-ins is shown after this list.
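The following sketch is not part of the official sample. It only illustrates how the two plug-ins fit together in a single pipeline: the filter section is omitted, the my_log type and the C:/test/*.log path are hypothetical, and the asterisks are placeholders for your Log Service parameters, which are described in Table 1. A complete IIS example that also includes a filter section is provided in the Procedure section.
input {
  file {
    type => "my_log"
    path => ["C:/test/*.log"]
    start_position => "beginning"
  }
}
output {
  if [type] == "my_log" {
    logservice {
      codec => "json"
      endpoint => "***"
      project => "***"
      logstore => "***"
      topic => ""
      source => ""
      access_key_id => "***"
      access_key_secret => "***"
      max_send_retry => 10
    }
  }
}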
Procedure
- Create a configuration file in the C:\logstash-2.2.2-win\conf\ directory.
Replace logstash-2.2.2-win with the directory name that matches your Logstash version. You can
create a configuration file for each type of log. The file name must be in the *.conf format.
- Create configurations for log collection and processing.
The following script shows sample log collection and processing configurations. The
log collection configurations are specified in the input section. The log processing and upload configurations are specified in the filter and output sections.
Note
- The configuration file must be encoded in UTF-8 without BOM. We recommend that you
use an editor such as Notepad++ to change the encoding format of the file.
- The path field specifies the path of the files to collect. You must use delimiters in the
UNIX format, for example, C:/test/multiline/*.log. Otherwise, fuzzy match is not supported.
- The value of the type field must be the same throughout a configuration file. If multiple Logstash
configuration files exist on a server, the value of the type field must be unique in each file.
input {
  file {
    type => "iis_log_1"
    path => ["C:/inetpub/logs/LogFiles/W3SVC1/*.log"]
    start_position => "beginning"
  }
}
filter {
  if [type] == "iis_log_1" {
    # Ignore log comments.
    if [message] =~ "^#" {
      drop {}
    }
    grok {
      # Check that the fields match your IIS log settings.
      match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:site} %{WORD:method} %{URIPATH:page} %{NOTSPACE:querystring} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clienthost} %{NOTSPACE:useragent} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:scstatus} %{NUMBER:time_taken}"]
    }
    # Parse the IIS timestamp and use it as the event timestamp.
    date {
      match => [ "log_timestamp", "YYYY-MM-dd HH:mm:ss" ]
      timezone => "Etc/UTC"
    }
    # Parse the user agent string into fields that are prefixed with browser.
    useragent {
      source => "useragent"
      prefix => "browser"
    }
    # Remove the raw timestamp field after it is parsed.
    mutate {
      remove_field => [ "log_timestamp" ]
    }
  }
}
output {
  if [type] == "iis_log_1" {
    # Upload the processed logs to Log Service.
    logservice {
      codec => "json"
      endpoint => "***"
      project => "***"
      logstore => "***"
      topic => ""
      source => ""
      access_key_id => "***"
      access_key_secret => "***"
      max_send_retry => 10
    }
  }
}
Table 1. Fields in the log processing configurations
Field | Description
endpoint | The endpoint of the Log Service project.
project | The name of the Log Service project.
logstore | The name of the Logstore.
topic | The topic of the log entry.
source | The source of the log entry. If this field is not specified, the IP address of the local server is automatically used.
access_key_id | The AccessKey ID of your Alibaba Cloud account.
access_key_secret | The AccessKey secret of your Alibaba Cloud account.
max_send_retry | The maximum number of retries that are performed when a data packet fails to be sent to Log Service. Data packets that still fail after the maximum number of retries are dropped. The retry interval is 200 milliseconds.
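For reference, the following sketch shows what a second configuration file might look like after the placeholders are filled in. Every value in it is a hypothetical example rather than a value from this topic: the iis_log_2 type, the W3SVC2 log path, the China (Hangzhou) endpoint, the project and Logstore names, and the AccessKey pair. Replace them with your own values. The filter section is omitted for brevity. Because this file would coexist with the first configuration file on the same server, it uses a type value that is different from iis_log_1.
input {
  file {
    type => "iis_log_2"
    path => ["C:/inetpub/logs/LogFiles/W3SVC2/*.log"]
    start_position => "beginning"
  }
}
output {
  if [type] == "iis_log_2" {
    logservice {
      codec => "json"
      # Hypothetical endpoint; use the endpoint of the region where your project resides.
      endpoint => "http://cn-hangzhou.log.aliyuncs.com"
      project => "my-project"
      logstore => "iis-logstore"
      topic => ""
      source => ""
      access_key_id => "<your AccessKey ID>"
      access_key_secret => "<your AccessKey secret>"
      max_send_retry => 10
    }
  }
}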
- Restart Logstash. For more information, see Start the service.
What to do next
If you use PowerShell to launch the logstash.bat process, the process runs in the
foreground. In most cases, this method is used to test and debug log collection. After
the debugging is completed, we recommend that you configure Logstash as a Windows
service so that Logstash runs in the background and starts automatically at system startup.
For more information, see Configure Logstash as a Windows service.