Domain-specific language (DSL) for Simple Log Service is a Python-compatible scripting language used for data transformation in Simple Log Service. Built on Python, the DSL provides more than 200 built-in functions that simplify data transformation jobs.
You can use DSL for Simple Log Service to edit functions in a flexible manner and combine functions to implement complex logic in most data transformation scenarios.
You can use DSL for Simple Log Service to distribute data to different Logstores based on specific logic and your business requirements. The names of the Logstores can be obtained by using dynamic computing or from external resources such as Object Storage Service (OSS) buckets.
You can use DSL for Simple Log Service to obtain data for enrichment from local or external resources, such as OSS buckets and ApsaraDB RDS for MySQL instances.
You can use DSL for Simple Log Service to perform regular mapping for dictionaries and tables and advanced mapping for tables.
You can use DSL for Simple Log Service to automatically refresh external resources that are loaded.
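As a sketch of how these capabilities combine, the rule below routes error logs to a separate storage target and enriches events from an inline dictionary. The function names (`e_if`, `e_search`, `e_output`, `e_dict_map`) follow the SLS DSL built-ins; the target name `error-logs` and the field names are hypothetical examples, and the rule runs only inside the Simple Log Service transformation runtime, not as plain Python.

```python
# Hypothetical transformation rule for an SLS data transformation job.

# Route events whose status field starts with 5 to a storage target named
# "error-logs" that is assumed to be configured on the transformation job.
e_if(e_search("status: 5*"), e_output(name="error-logs"))

# Enrich events by mapping the value of the "status" field to a new
# "status_desc" field through an inline dictionary.
e_dict_map({"500": "server error", "404": "not found"}, "status", "status_desc")
```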
Global processing functions
DSL for Simple Log Service provides approximately 30 global processing functions. You can configure the parameters of global processing functions to control processing operations. Global processing functions accept the results of expression functions as parameters. Flow control functions are a type of global processing function and can be used together with expression functions and the following types of global processing functions:
You can control processing flows based on conditions by using flow control functions such as e_if, e_if_else, and e_switch.
You can use simple search functions, such as e_search, to process different types of logs in a flexible manner.
You can discard, retain, split, write, and replicate events.
You can retain, delete, and rename fields.
You can extract values or key-value pairs from fields based on regular expressions, Grok patterns, syslog protocols, quotes, key-value pair delimiters, and delimiters such as commas (,), vertical bars (|), and tabs (\t).
You can extract and enrich JSON data.
You can map or search for data based on a dictionary or a table.
You can obtain information about a dimension table that is used to enrich data from resources such as rule configurations, external OSS buckets, and ApsaraDB RDS for MySQL instances.
You can use a function to automatically refresh external resources based on full or incremental change logs.
You can enrich the information about some log fields. For example, you can obtain threat intelligence for an IP address and store the threat intelligence to log fields for log analysis.
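The following rule sketch combines several of the global processing functions above: key-value extraction, regular-expression extraction, and enrichment from a dimension table. The function names (`e_kv`, `e_regex`, `e_table_map`, `res_rds_mysql`) follow the SLS DSL built-ins, while the connection parameters, field names, and table columns are placeholders; the rule runs only inside the Simple Log Service transformation runtime.

```python
# Hypothetical SLS DSL rule; not executable as plain Python.

# Extract key-value pairs such as "user=alice level=warn" from the message field.
e_kv("message")

# Extract named capture groups from the request field into new event fields.
e_regex("request", r"(?P<method>\w+) (?P<path>\S+)")

# Enrich events from a dimension table loaded from an ApsaraDB RDS for MySQL
# instance; the connection parameters and column names below are placeholders.
e_table_map(
    res_rds_mysql(address="rds-host", username="user", password="***",
                  database="db", table="user_info", refresh_interval=300),
    "user", ["department", "city"])
```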
Expression functions
DSL for Simple Log Service provides more than 200 built-in expression functions to convert events or affect the results of the global processing functions. The expression functions are suitable for most data transformation scenarios. DSL for Simple Log Service provides the following expression functions:
DSL for Simple Log Service provides a condition-based filtering mechanism that uses Lucene-like syntax, complete regular expressions, strings, wildcard characters, numeric value comparison, and logical operators such as AND, OR, and NOT.
You can extract, control, and compare field values. You can also perform container evaluation and operations on multiple fields.
You can convert the values of basic data types. You can also convert numbers, dictionaries, and lists.
You can perform basic, multi-value, and mathematical calculations. You can also perform operations based on mathematical parameters.
You can encode, decode, sort, reverse, replace, normalize, search, evaluate, truncate, and format multiple fields. You can also perform evaluation based on character sets.
You can convert date and time values. You can obtain date and time attributes, date and time values, UNIX timestamps, and date and time strings. You can also modify and compare date and time values.
You can extract, match, evaluate, replace, and truncate fields.
DSL for Simple Log Service provides more than 400 built-in Grok patterns, which you can use in place of complex regular expressions.
You can extract and filter JSON, Protobuf, and XML data.
You can parse IP addresses and convert data.
You can encode and decode text in HTML, URL, and Base64 formats, and compute SHA1, SHA256, SHA512, and MD5 hashes.
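To illustrate what the encoding and hashing operations above compute, here is a plain-Python analogue using only the standard library; in the DSL itself these are exposed as expression functions (for example, Base64, MD5, and URL encoding functions), and the sample string below is an arbitrary example.

```python
import base64
import hashlib
import urllib.parse

text = "user=alice&role=admin"

# Base64 encoding of the raw bytes.
b64 = base64.b64encode(text.encode()).decode()

# MD5 and SHA256 digests as hexadecimal strings.
md5 = hashlib.md5(text.encode()).hexdigest()
sha256 = hashlib.sha256(text.encode()).hexdigest()

# URL (percent) encoding; '=' and '&' become %3D and %26.
url = urllib.parse.quote(text)

print(b64)
print(url)
```

Each operation is reversible or verifiable: Base64 and URL encoding round-trip back to the original string, while the hashes are fixed-length one-way digests.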