
Simple Log Service: Syntax comparison for upgrades

Last Updated: May 09, 2025

This topic describes the syntax comparison between the old and new versions of data transformation.

For more information about the comparison between Simple Log Service Processing Language (SPL) and SQL syntax in various data transformation scenarios, see Scenario comparison between SPL and SQL.

Synchronize data (no processing logic required)

Version

Script description

Old version

The DSL script is empty.

New version

The SPL rule is empty.

Filter data: exact match by text type

Version

Script description

Old version

e_keep(v("level") == "ERROR") or

e_drop(v("level") != "ERROR") or

e_if(v("level") != "ERROR", e_drop()) or

e_keep(e_search("level==ERROR"))

New version

| where level='ERROR'

Filter data by numeric type

Version

Script description

Old version

e_keep(ct_int(v("status"))>=400)

New version

| where cast(status as bigint)>=400
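Both scripts keep only events whose status field, parsed as an integer, is at least 400. As an illustrative sketch (not SLS code), the same filter in plain Python over a list of events, with made-up sample values:

```python
# Keep only events whose "status" field is >= 400 when cast to an integer,
# mirroring ct_int(v("status")) >= 400 / cast(status as bigint) >= 400.
def keep_status_ge_400(events):
    return [e for e in events if int(e["status"]) >= 400]

events = [{"status": "200"}, {"status": "404"}, {"status": "500"}]
print(keep_status_ge_400(events))  # [{'status': '404'}, {'status': '500'}]
```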

Filter data: fuzzy match

Version

Script description

Old version

e_keep(op_in(v("level"), "ERROR")) or

e_keep(e_search("level: ERROR")) or

e_if(op_not_in(v("level"), "ERROR"), e_drop())

New version

| where level like '%ERROR%'

Add a field: extract or construct a single piece of key information

Version

Script description

Old version

  1. Extract an information item by using a regular expression.

    e_set("version", regex_select(v("data"), r'"version":\d+'))

  2. Extract an information item by using JSON. For more information about the JSON query syntax in the old version of data transformation, see JMESPath syntax.

    e_set("version", json_select(v("data"), "version"))

New version

  1. Extract an information item by using a regular expression.

    | extend version=regexp_extract(data, '"version":\d+')

  2. Extract an information item by using JSON. For more information about the JSON object path reference in the new version of data transformation, see JsonPath on GitHub.

    | extend version=json_extract(data, '$.version')
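Both versions support regular-expression extraction and JSON-path extraction. The following Python sketch (illustrative only, with a made-up sample payload) shows what each approach produces:

```python
import json
import re

# Sample event body; "data" holds a JSON string containing a version number.
data = '{"version":3,"name":"svc"}'

# Regular-expression extraction, mirroring regex_select / regexp_extract:
# the whole match, including the key, becomes the field value.
version_by_regex = re.search(r'"version":\d+', data).group(0)

# JSON-path-style extraction, mirroring json_select("version") /
# json_extract(data, '$.version'): only the value is returned.
version_by_json = json.loads(data)["version"]

print(version_by_regex)  # "version":3
print(version_by_json)   # 3
```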

Parse and format time values

Version

Script description

Old version

  1. Extract the time field __time__ from a log.

    e_set(
        "__time__", 
        dt_parsetimestamp(
            v("time"), 
            fmt="%Y/%m/%d %H-%M-%S",
        ),
    )
  2. Format the time.

    e_set(
        "time",
        dt_strftime(
            dt_parse(
                v("time"), 
                fmt="%Y/%m/%d %H-%M-%S",
            ), 
            fmt="%Y-%m-%d %H:%M:%S",
        ),
    )

New version

  1. Extract the time field __time__ from a log.

    | extend time=date_parse(time, '%Y/%m/%d %H-%i-%S')

    | extend __time__=cast(to_unixtime(time) as bigint)

  2. Format the time.

    | extend time=date_parse(time, '%Y/%m/%d %H-%i-%S')

    | extend time=date_format(time, '%Y-%m-%d %H:%i:%S')
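Note that the two versions use different format specifiers for minutes: the old version's strftime-style `%M`, and the new version's MySQL-style `%i`. A minimal Python sketch of the parse-then-format flow, assuming the raw time is UTC and using a made-up sample value:

```python
from datetime import datetime, timezone

raw = "2025/05/09 13-45-30"

# Parse the custom format (strftime-style %M is minutes here; the new
# SPL syntax writes minutes as the MySQL-style %i instead).
t = datetime.strptime(raw, "%Y/%m/%d %H-%M-%S")

# __time__ is a Unix timestamp; this sketch assumes the raw time is UTC.
unix_ts = int(t.replace(tzinfo=timezone.utc).timestamp())

# Reformat into the standard layout, mirroring dt_strftime / date_format.
formatted = t.strftime("%Y-%m-%d %H:%M:%S")

print(formatted)  # 2025-05-09 13:45:30
```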

Process and filter fields

Version

Script description

Old version

  1. Search for fields in exact mode.

    e_keep_fields("__tag__:node", "path", regex=False)

  2. Search for fields by mode.

    e_keep_fields("__tag__:.*", regex=True)

  3. Rename specific fields.

    e_rename("__tag__:node", "node")

  4. Exclude fields by mode.

    e_drop_fields("__tag__:.*", regex=True)

New version

  1. Search for fields in exact mode.

    | project node="__tag__:node", path

  2. Search for fields by mode.

    | project -wildcard "__tag__:*"

  3. Rename specific fields.

    | project-rename node="__tag__:node"

  4. Exclude fields by mode.

    | project-away -wildcard "__tag__:*"
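Treating an event as a dictionary of fields, the keep/drop operations above can be sketched in plain Python (illustrative only; sample field values are made up):

```python
import re

event = {"__tag__:node": "n1", "__tag__:host": "h1", "path": "/a", "x": "1"}

# Keep fields by exact name, mirroring e_keep_fields(..., regex=False) / project.
kept = {k: v for k, v in event.items() if k in ("__tag__:node", "path")}

# Keep fields by pattern, mirroring e_keep_fields("__tag__:.*", regex=True)
# / project -wildcard "__tag__:*".
tags = {k: v for k, v in event.items() if re.fullmatch(r"__tag__:.*", k)}

# Drop fields by pattern, mirroring e_drop_fields / project-away.
no_tags = {k: v for k, v in event.items() if not re.fullmatch(r"__tag__:.*", k)}

print(sorted(kept))     # ['__tag__:node', 'path']
print(sorted(no_tags))  # ['path', 'x']
```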

Extract multiple fields by using a regular expression

Version

Script description

Old version

e_regex("data", r"(\S+)\s+(\w+)", ["time", "level"])

New version

| parse-regexp data, '(\S+)\s+(\w+)' as time, level
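Both scripts apply one regular expression with two capture groups and map the groups to the fields time and level, in order. An illustrative Python sketch with a made-up sample value:

```python
import re

# Extract two fields from "data", mirroring e_regex / parse-regexp:
# group 1 becomes "time", group 2 becomes "level".
data = "2025-05-09T13:45:30 ERROR"
m = re.search(r"(\S+)\s+(\w+)", data)
time, level = m.group(1), m.group(2)

print(time)   # 2025-05-09T13:45:30
print(level)  # ERROR
```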

Expand key-value pairs of JSON objects into data fields

Version

Script description

Old version

For more information about the JSON query syntax in the old version of data transformation, see JMESPath syntax.

e_json("data", depth=1, jmes="x.y.z")

New version

For more information about the JSON object path reference in the new version of data transformation, see JsonPath on GitHub.

| parse-json -path='$.x.y.z' data

Extract the content in CSV files as data fields

Version

Script description

Old version

e_csv("data", ["time", "addr", "user"], sep="\0", quote='"')

New version

  1. Extract the content by using single-character delimiters. For more information about the CSV format, see RFC 4180 (Comma-Separated Values).

    | parse-csv -delim='\0' -quote='"' data as time, addr, user

  2. Extract the content by using multi-character delimiters.

    | parse-csv -delim='^_^' data as time, addr, user
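As an illustrative Python sketch of both cases (sample rows are made up; a comma stands in for the single-character delimiter, since Python's csv module also accepts only one-character delimiters, which is why the multi-character case falls back to a plain split):

```python
import csv
import io

# Single-character delimiter with quoting, mirroring e_csv / parse-csv.
row = next(csv.reader(io.StringIO('2025-05-09,"10.0.0.1",alice'), delimiter=","))
time, addr, user = row

# Multi-character delimiter: csv cannot handle it, so split the string directly.
time2, addr2, user2 = "2025-05-09^_^10.0.0.1^_^bob".split("^_^")

print(addr)   # 10.0.0.1
print(user2)  # bob
```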

Process logical branches: parallel branches

Version

Script description

Old version

e_if(
    e_has("a"), e_set("mode_a", "1"), 
    e_has("b"), e_set("mode_b", "1"),
)

The preceding code is equivalent to the following Python code:

if e_has("a"):
    e_set("mode_a", "1")
if e_has("b"):
    e_set("mode_b", "1")

New version

.let a = *
| where a is not null
| extend mode_a='1';

.let b = *
| where b is not null
| extend mode_b='1';

$a;
$b;

Process logical branches: mutually exclusive branches (if-else and switch)

Version

Script description

Old version

e_switch(
    e_has("a"), e_keep_fields("x", "y", "z"), 
    e_has("b"), e_keep_fields("u", "v"),
    default=e_keep_fields("w"),
)

The preceding code is equivalent to the following Python code:

if e_has("a"):
    e_keep_fields("x", "y", "z")
elif e_has("b"):
    e_keep_fields("u", "v")
else:
    e_keep_fields("w")

New version

.let src = *
| extend mode=case
    when a is not null then 1
    when b is not null then 2
    else 0
  end;

.let a = $src | where mode=1 | project x, y, z;
.let b = $src | where mode=2 | project u, v;
.let c = $src | where mode=0 | project w;

$a;
$b;
$c;

Dynamically select a destination project and a Logstore based on rules

Version

Script description

Old version

e_output(project=v("dst_project"), logstore=v("dst_logstore"))

New version

| extend "__tag__:__sls_etl_output_project__"=dst_project

| extend "__tag__:__sls_etl_output_logstore__"=dst_logstore

Route the transformation results to the required shard based on a specific hash key

Version

Script description

Old version

e_output(hash_key_field="key_field")

New version

| extend "__tag__:__sls_etl_output_hash_key__"=to_hex(md5(to_utf8(key_field)))
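The new version derives the hash key explicitly: UTF-8-encode the key field, take its MD5 digest, and render it as hex. An equivalent Python sketch with a made-up key value:

```python
import hashlib

# Compute the hex MD5 of the key field, mirroring
# to_hex(md5(to_utf8(key_field))) in the SPL rule above.
key_field = "user-42"
hash_key = hashlib.md5(key_field.encode("utf-8")).hexdigest()

print(hash_key)  # a 32-character lowercase hex digest
```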

Encapsulate and serialize fields into JSON data and store the JSON data in a new field

Version

Script description

Old version

  1. Encapsulate all fields.

e_pack_fields("content", include="\w+")

  2. Use a regular expression to extract values from the dict field, encapsulate the values, and then assign the encapsulated values to the name field.

e_regex("dict", r"(\w+):(\d+)", {r"k_\1": r"\2"}, pack_json="name")

New version

  1. Encapsulate all fields.

| pack-fields -include='\w+' as content

  2. Use a regular expression to extract values from the dict field, encapsulate the values, and then assign the encapsulated values to the name field.

| parse-kv -prefix='k_' -regexp dict, '(\w+):(\d+)' | pack-fields -include='k_.*' as name
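The second case above extracts key-value pairs, prefixes the keys with k_, and serializes them into a new field as JSON. An illustrative Python sketch with a made-up dict field:

```python
import json
import re

# Extract key-value pairs from "dict", prefix the keys with k_, and pack
# them into a JSON string stored in the "name" field.
dict_field = "a:1 b:22"
pairs = {f"k_{k}": v for k, v in re.findall(r"(\w+):(\d+)", dict_field)}
name = json.dumps(pairs)

print(name)  # {"k_a": "1", "k_b": "22"}
```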

Convert logs to metrics that can be stored in a Metricstore

Version

Script description

Old version

e_to_metric(names="rt", labels="host")

New version

| log-to-metric -names='["rt"]' -labels='["host"]'