When you collect logs or ship data to other cloud services, Simple Log Service adds information, such as the log source and timestamp, to your logs as key-value pairs. These key-value pairs are the reserved fields in Simple Log Service.
Important
- When writing log data by using an API or configuring Logtail, avoid using reserved field names for your custom fields. This prevents field name conflicts and ensures accurate query results.
- Legacy shipping tasks do not support shipping fields with the `__tag__` prefix.
- If your Logstore uses the pay-by-ingested-data billing mode, Simple Log Service does not charge you for the reserved fields it adds to your logs. For more information, see pay-by-ingested-data.
- If your Logstore uses the pay-by-feature billing mode, charges apply to the added reserved fields. Enabling an index for these fields also incurs minor fees for index traffic and storage. For more information, see pay-by-feature billing.
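Because reserved names must not be reused for custom fields, a client-side check before writing logs through an API can catch collisions early. The following sketch is illustrative, not part of any SDK; it validates candidate field names against the reserved names listed in the table below:

```python
# Reserved field names documented by Simple Log Service.
RESERVED_FIELDS = {
    "__time__", "__source__", "__topic__", "_extract_others_",
    "__raw_log__", "__raw__",
}
# System tags such as __tag__:__client_ip__ and __tag__:__path__ share this prefix.
RESERVED_PREFIX = "__tag__"

def check_custom_fields(fields):
    """Return the custom field names that collide with reserved fields."""
    return sorted(
        name for name in fields
        if name in RESERVED_FIELDS or name.startswith(RESERVED_PREFIX)
    )

# Example: two of these custom field names conflict with reserved names.
conflicts = check_custom_fields({"level": "INFO", "__time__": "x", "__tag__:env": "prod"})
```

A non-empty result means the log payload should be renamed before it is written, so that queries do not return ambiguous results.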
| Reserved field | Type | Index and analysis | Description |
| --- | --- | --- | --- |
| `__time__` | Integer. A standard Unix timestamp. | | The timestamp of the log, specified when the log is written. This field can be used for log shipping, queries, and analysis. |
| `__source__` | String. | | The source of the log, such as an IP address or machine identifier. This field can be used for log shipping, queries, analysis, and custom consumption. |
| `__topic__` | String. | | The log topic. If you set a topic for your logs, Simple Log Service automatically adds this field. The key is `__topic__` and the value is the specified topic content. This field can be used for log shipping, queries, analysis, and custom consumption. For more information, see topic. |
| `_extract_others_` | A string that can be deserialized into a JSON map. | This field does not exist in the log content. You do not need to create an index for it. | This field is an alias for `__extract_others__`. We recommend using `__extract_others__`. |
| `__tag__:__client_ip__` | String. | | A system tag for the public IP address of the log source. If public IP address recording is enabled, the server appends this field to the raw log upon receipt. This field can be used for log queries, analysis, and custom consumption. When you use this field in SQL analysis, you must enclose the field name in double quotation marks (""). For more information, see tag and Record public IP addresses. |
| `__tag__:__receive_time__` | String. The value can be converted to an integer that represents a Unix timestamp. | | A system tag for the time a log arrives at the server. If public IP address recording is enabled, the server appends this field to the raw log upon receipt. This field can be used for log queries, analysis, and custom consumption. For more information, see tag and Record public IP addresses. |
| `__tag__:__path__` | String. | | The full path of the log file, which Logtail adds automatically during collection. This field can be used for log queries, analysis, and custom consumption. When you use this field in SQL analysis, you must enclose the field name in double quotation marks (""). |
| `__tag__:__hostname__` | String. | | The hostname of the machine where Logtail collects data, which Logtail adds automatically. This field can be used for log queries, analysis, and custom consumption. When you use this field in SQL analysis, you must enclose the field name in double quotation marks (""). |
| `__raw_log__` | String. | You must manually create an index for this field. Set the index data type to text and enable log analysis as needed. | Contains the raw log if parsing fails. Logtail uploads the raw log to this field if you disable the Drop Failed to Parse Logs feature. This field can be used for log shipping, queries, analysis, and custom consumption. For more information, see Drop Failed to Parse Logs. |
| `__raw__` | String. | You must manually create an index for this field. Set the index data type to text and enable log analysis as needed. | Contains the original raw log after successful parsing. If you enable the Upload Raw Log feature, Logtail includes the raw log content in this field along with the parsed data. This is useful for auditing and compliance scenarios. This field can be used for log shipping, queries, analysis, and custom consumption. For more information, see Upload Raw Log. |