DataWorks: Redis data source

Last Updated: Feb 02, 2026

Use Redis Writer in DataWorks Data Integration to write data to a Redis data source. This topic explains how to configure Redis Writer for offline data synchronization.

Limitations

  • You can run the synchronization job on serverless resource groups for Data Integration (recommended) or exclusive resource groups for Data Integration.

  • When you use the List data type, rerunning a synchronization job is not an idempotent operation. You must manually clear the data from Redis before you rerun the job.

    Important

    Redis Writer does not support Bloom filter configuration. As a workaround for handling duplicate data, you can add a node, such as a Shell, Python, or PyODPS node, before or after the synchronization node in your workflow.

Supported data types

Redis supports a rich set of value types, including String, List, Set, Sorted Set, and Hash. For more information about Redis, see redis.io.

Configure a synchronization job

For information about where to configure a synchronization task and the configuration procedure, see the following configuration guides.

Appendix: Script sample and parameters

Configure a batch synchronization task by using the code editor

If you want to configure a batch synchronization task by using the code editor, you must configure the related parameters in the script based on the unified script format requirements. For more information, see Configure a task in the code editor. The following sections describe the data source parameters that you must configure when you use the code editor.

Writer script sample

The following sample script shows a synchronization job that uses MySQL Reader to read data from a MySQL source and Redis Writer to write the data to a Redis destination.

{
    "type":"job",
    "version":"2.0",  // The version number.
    "steps":[
        { // The following code is for the reader. For parameter details, see the documentation for the corresponding Reader plugin.
            "stepType":"mysql",   
            "parameter": {
                "envType": 0,
                "datasource": "xc_mysql_demo2",
                "column": [
                    "id",
                    "value",
                    "table"
                ],
                "connection": [
                    {
                        "datasource": "xc_mysql_demo2",
                        "table": []
                    }
                ],
                "where": "",
                "splitPk": "",
                "encoding": "UTF-8"
            },
            "name":"Reader",
            "category":"reader"
        },
        { // The following code is for the writer.
            "stepType":"redis",                    // The plugin name. Set to redis for Redis Writer.
            "parameter":{                          // The main parameters for Redis Writer.
                "expireTime":{                     // The expiration time for the key-value pair. You can configure it in seconds or as a Unix timestamp.
                    "seconds":"1000"
                }, 
                "keyFieldDelimiter":"u0001",       // The delimiter for concatenating multiple columns into a Redis key.
                "dateFormat":"yyyy-MM-dd HH:mm:ss",// The format for Date-type values when written to Redis.
                "datasource":"xc_mysql_demo2",     // The data source name. This must match the name of the data source you added.
                "envType": 0,                      // The environment type. 0 for the production environment, 1 for the development environment.
                "writeMode":{                      // The write mode.
                    "type":"string",               // The value type.
                    "mode":"set",                  // The write mode for the specified value type.
                    "valueFieldDelimiter":"u0001"  // The delimiter for concatenating multiple columns into a value.
                },
                "keyIndexes":[0,1],                // Maps source columns to the Redis key. Specifies the indexes of the source columns to use as the key. Column indexes start from 0. If you use the first and second columns as a composite key, set this to [0,1].
                "batchSize":"1000",                // The number of records to write in a single batch.
        "column": [                        // This parameter applies to the set operation for the String data type. If you do not configure this parameter, the value is a string of concatenated values separated by a delimiter (CSV format). For example, if the value of age is 18 and the value of sex is male, the Redis value is "18::male". If you configure this parameter as shown, the value is written in JSON format, including the original column names and their values. For example, if id is 1, name is "John", age is 18, and sex is "male", the Redis value is {"id":1,"name":"John","age":18,"sex":"male"}.
                {
                "name": "id",
                "index": "0"

                },
                {
                "name": "name",
                "index": "1"
                },
                {
                "name": "age",
                "index": "2"
                },
                {
                "name": "sex",
                "index": "3"
                }
            ]
            },
            "name":"Writer",
            "category":"writer"
        }
    ],
    "setting":{
        "errorLimit":{
            "record":"0"                           // The maximum allowed error record count.
        },
        "speed":{
            "throttle":true, // Enables or disables throttling. If set to false, the mbps parameter is ignored.
            "concurrent":1,  // The job concurrency.
            "mbps":"12"      // The maximum transfer rate in MB/s.
        }
    },
    "order":{
        "hops":[
            {
                "from":"Reader",
                "to":"Writer"
            }
        ]
    }
}

Writer parameters

The following entries describe each Redis Writer parameter, whether it is required, and its default value.

expireTime

The expiration time for the key in Redis. If you do not specify this parameter, the default value 0 is used, which means the key never expires.

You can configure expireTime in one of the following ways, as shown in the example after this entry:

  • seconds: Specifies the number of seconds from the current time until the key expires.

  • unixtime: Specifies the expiration time as a Unix timestamp, which is the number of seconds that have elapsed since 1970-01-01 00:00:00 UTC.

Required: No

Default: 0 (the key never expires)
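
For example, the following fragments show the two forms. The values are placeholders, and the unixtime field name follows the description above; confirm it in the code editor if you use this form.

"expireTime": {
    "seconds": "3600"            // The key expires 3,600 seconds (1 hour) after it is written.
}

"expireTime": {
    "unixtime": "1735689600"     // The key expires at this Unix timestamp (a placeholder value).
}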

keyFieldDelimiter

The delimiter used to concatenate multiple columns to form a Redis key, for example, key=key1\u0001id. This parameter is required when you combine multiple columns to form a key. You can omit this parameter if the key consists of a single column.

Required: No

Default: \u0001

dateFormat

The format for Date-type values when written to Redis, such as yyyy-MM-dd HH:mm:ss.

Required: No

Default: None

datasource

The name of the data source. This name must match the one you configured in the DataWorks console.

Required: Yes

Default: None

selectDatabase

The index of the destination database. Valid values range from "0" to "N-1", where N is the number of `databases` configured in Redis. This parameter is unavailable for Redis clusters.

Required: No

Default: 0 (database 0)
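
For example, the following sketch writes to logical database 1 instead of the default database 0. The data source name is a placeholder, the other parameters are reduced to a minimum, and whether the value is passed as a string or a number should match what the code editor generates.

"parameter": {
    "datasource": "my_redis_demo",   // Placeholder data source name.
    "selectDatabase": "1",           // Write to logical database 1; omit this parameter to use database 0.
    "keyIndexes": [0],
    "writeMode": {
        "type": "string",
        "mode": "set"
    }
}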

writeMode

The data type of the value to be written to Redis. Redis Writer supports the following five types:

  • String (string)

  • List (list)

  • Set (set)

  • Sorted Set (zset)

  • Hash (hash)

The configuration of writeMode varies by data type. For details, see the writeMode parameters section below.

Note

You must configure writeMode with one of the five supported data types. Only one type is allowed. If you do not configure this parameter, the default value string is used.

Required: No

Default: string

keyIndexes

The zero-based indexes of the source columns to use for the Redis key. A configuration example follows this entry.

  • To use a single column from the source as the Redis key, specify its index. For example, to use the first column as the key, set the value to 0.

  • To use multiple source columns as a composite Redis key, specify their indexes in an array. For example, to use the second and third columns as a composite key, set this to [1,2].

Note

Redis Writer uses all columns not specified in keyIndexes as the value. To synchronize only specific columns, configure the column parameter in the Reader plugin to filter them.

Required: Yes

Default: None
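
As an illustration, assume the reader outputs three columns in the order user_id, city, and score (hypothetical names). The following sketch uses only the first column as the key, so the remaining two columns are concatenated to form the value:

"keyIndexes": [0],                   // Use the first source column (user_id) as the Redis key.
"keyFieldDelimiter": "\u0001",       // Only takes effect when keyIndexes lists more than one column.
"writeMode": {
    "type": "string",
    "mode": "set",
    "valueFieldDelimiter": "\u0001"  // city and score are joined by this delimiter to form the value.
}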

batchSize

The number of records to write in a single batch. A larger value can reduce network interactions with Redis and improve throughput. However, setting this value too high may cause an out-of-memory (OOM) error in the synchronization job process.

Required: No

Default: 1000

timeout

The write operation timeout in milliseconds.

Required: No

Default: 30000

redisMode

The Redis deployment mode. Valid values (see the example after this entry):

  • Cluster mode: Set redisMode to ClusterMode.

    In this mode, Redis Writer connects directly to the Redis cluster. This mode is typically used for self-managed Redis clusters and Alibaba Cloud Redis instances with direct connection addresses. Cluster mode supports batch writing.

  • Non-cluster mode: Leave redisMode empty or do not configure it.

    This mode is typically used for Alibaba Cloud Redis instances with proxy addresses, read/write splitting addresses, or standard edition addresses. Non-cluster mode does not support batch writing.

Required: No

Default: None (non-cluster mode)
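
For example, the following sketch connects to a self-managed Redis cluster through a direct connection address. The data source name is a placeholder and the other parameters are reduced to a minimum:

"parameter": {
    "datasource": "my_redis_cluster",   // Placeholder data source that points to the cluster's direct connection address.
    "redisMode": "ClusterMode",         // Connect in cluster mode; omit this parameter for non-cluster deployments.
    "keyIndexes": [0],
    "writeMode": {
        "type": "string",
        "mode": "set"
    }
}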

column

The configuration for columns to be written to Redis. This parameter applies when `writeMode.type` is `string` and `writeMode.mode` is `set`.

  • If you do not configure this parameter, the value is written as a string of concatenated values separated by a delimiter (CSV format). For example, if a record has values `18` and `male`, the value in Redis might be `"18::male"`.

  • If you configure this parameter, for example, "column": [{"index":"0", "name":"id"}, {"index":"1", "name":"name"}], the value is written in JSON format, such as {"id":"","name":""}. For example, if the `id` is `1` and `name` is `John`, the value in Redis is {"id":"1","name":"John"}.

Required: No

Default: None
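
Because only datasource and keyIndexes are required, a minimal Redis Writer parameter block can rely on the defaults described above (the string value type and no expiration). The following sketch uses a placeholder data source name:

"parameter": {
    "datasource": "my_redis_demo",   // Placeholder data source name.
    "keyIndexes": [0]                // Use the first source column as the key; the remaining columns form the value.
}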

writeMode parameters

For each supported value type, the following entries describe how to configure the type, mode, and valueFieldDelimiter parameters, followed by a sample configuration.

String

Set type to string.

For the String value type, mode specifies the write mode:

  • Set mode to set.

  • If a key already exists, its value is overwritten.

valueFieldDelimiter is the delimiter for concatenating values. The default value is \u0001.

  • This parameter is used when a source data row has more than one value column. For example, if there are three value columns, they are concatenated by the delimiter, such as value1\u0001value2\u0001value3.

  • If the source data has only one key column and one value column, you do not need to configure this parameter.

"writeMode":{
        "type": "string",
        "mode": "set",
        "valueFieldDelimiter": "\u0001"
        }

List

Set type to list.

For the List value type, mode can be one of the following:

  • lpush: Pushes an element to the head (left) of the list.

  • rpush: Pushes an element to the tail (right) of the list.

"writeMode":{
    "type": "list",
    "mode": "lpush|rpush",
    "valueFieldDelimiter": "\u0001"
}

Set

Set type to set.

For the Set value type, mode specifies the write mode:

  • Set mode to sadd to add members to the set.

  • If a key with the same name exists but is of a different type, it is overwritten.

"writeMode":{
        "type": "set",
        "mode": "sadd",
        "valueFieldDelimiter": "\u0001"
        }

Sorted Set

Set type to zset.

For the Sorted Set value type, mode specifies the write mode:

  • Set mode to zadd to add members to the sorted set.

  • If a key with the same name exists but is not a sorted set, its value is overwritten.

The valueFieldDelimiter parameter is not required.

"writeMode": {
    "type": "zset",
    "mode": "zadd"
}
Note

When the value type is `zset`, each source record must provide exactly one score/member pair in addition to the key column(s). The score must precede the member. This format ensures that Redis Writer can parse the data correctly.

Hash

Set type to hash.

For the Hash value type, mode specifies the write mode:

  • Set mode to hset to add data to the hash.

  • If a key with the same name exists but is not a hash, its value is overwritten.

The valueFieldDelimiter parameter is not required.

"writeMode": {
    "type": "hash",
    "mode": "hset"
}
Note

When the value type is `hash`, each source record must provide exactly one field/value pair in addition to the key column(s). The field must precede the value. This format ensures that Redis Writer can parse the data correctly.
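
To make the hash record layout concrete, the following sketch assumes the reader outputs three columns per record in the order user_id, attribute_name, and attribute_value (hypothetical names). With keyIndexes set to [0], each record writes one field/value pair into the hash whose key is the user_id column:

"parameter": {
    "datasource": "my_redis_demo",   // Placeholder data source name.
    "keyIndexes": [0],               // The user_id column becomes the hash key.
    "writeMode": {
        "type": "hash",
        "mode": "hset"               // attribute_name becomes the field and attribute_value becomes the value.
    }
}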