
Redis memory usage analysis

Last Updated: Oct 10, 2017

Background

Users often want to see how much memory the data in their Redis instances uses. To avoid affecting online instances, we generally run bgsave to generate a dump.rdb file and then perform a static analysis of this file with redis-rdb-tools and SQLite. The analysis process is simple and practical, and Redis users can easily learn it.

Create a backup

If you are using a Redis client, run bgsave to generate an .rdb file.
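
For example, assuming the instance is reachable at 127.0.0.1:6379 (replace the host and port with your own), you can trigger the background save and confirm that it has finished with redis-cli:

  # Start a background save; dump.rdb is written to the server's data directory
  redis-cli -h 127.0.0.1 -p 6379 BGSAVE
  # LASTSAVE returns the Unix timestamp of the last successful save and advances once the dump completes
  redis-cli -h 127.0.0.1 -p 6379 LASTSAVE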

If you have subscribed to an ApsaraDB for Redis instance, create a data backup in the console and then download it. The downloaded backup is saved as an .rdb file. See the following figure for more information.

(Figure: downloading the backup file from the console)

Generate a memory snapshot

redis-rdb-tools is a Python tool for parsing RDB files. It provides three main functions:

  • Generate memory snapshots.
  • Dump RDB files into JSON format.
  • Use standard diff tools to compare dump files.

When analyzing memory usage, we mainly use the memory snapshot generation function.
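
For reference, the other two functions are typically invoked as follows; the exact option names may vary between versions, so check rdb --help if these do not match your installation:

  # Dump the contents of the RDB file as JSON
  rdb --command json dump.rdb > dump.json
  # Produce a sorted text dump that standard diff tools can compare
  rdb --command diff dump.rdb | sort > dump.txt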

Install redis-rdb-tools

You can install redis-rdb-tools in either of the following two ways.

Install it from Python Package Index (PyPI)

  pip install rdbtools

Install it from source code

  git clone https://github.com/sripathikrishnan/redis-rdb-tools
  cd redis-rdb-tools
  sudo python setup.py install
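
Either way, you can verify the installation by checking that the rdb command is available on your PATH:

  rdb --help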

Use redis-rdb-tools to generate a memory snapshot

Run the following command to generate a memory snapshot:

  rdb -c memory dump.rdb > memory.csv

The generated memory report is in CSV format. It contains the database ID, data type, key, memory usage (in bytes), encoding, number of elements, and length of the largest element. The memory usage figure covers the key, the value, and other per-key overhead.

Note: The memory usage value is a theoretical approximation; it is generally slightly lower than the actual value. Here’s a sample memory.csv file:

  $ head memory.csv
  database,type,key,size_in_bytes,encoding,num_elements,len_largest_element
  0,string,"orderAt:377671748",96,string,8,8
  0,string,"orderAt:413052773",96,string,8,8
  0,sortedset,"Artical:Comments:7386",81740,skiplist,479,41
  0,sortedset,"pay:id:18029",2443,ziplist,84,16
  0,string,"orderAt:452389458",96,string,8,8
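
On large instances the CSV file can become very large. The tool also supports filtering the report; for example, the following invocations (option names may be version-dependent; check rdb --help) restrict the output to keys above a size threshold or to the largest keys only:

  # Only report keys whose estimated size is at least 1024 bytes
  rdb -c memory --bytes 1024 -f memory.csv dump.rdb
  # Only report the 10 largest keys
  rdb -c memory --largest 10 -f memory.csv dump.rdb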

Analyze a memory snapshot

SQLite is a lightweight database. After you import the previously generated .csv file into it, you can use SQL statements to easily perform various analyses on the Redis memory data.

Import the .csv file

  sqlite3 memory.db
  sqlite> create table memory(database int,type varchar(128),key varchar(128),size_in_bytes int,encoding varchar(128),num_elements int,len_largest_element varchar(128));
  sqlite> .mode csv memory
  sqlite> .import memory.csv memory
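
Note: When the table already exists, .import treats every line of the CSV file, including the header, as data (recent SQLite versions support .import --csv --skip 1 to skip it). If the header line ends up in the table, delete it before running queries:

  sqlite> delete from memory where key='key';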

You can now run queries against the imported data. Here are a few simple examples:

Query the number of keys

  sqlite> select count(*) from memory;

Query the total memory usage

  sqlite> select sum(size_in_bytes) from memory;
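
To read the result in megabytes instead of bytes, you can divide in the query:

  sqlite> select sum(size_in_bytes)/1024.0/1024.0 as total_mb from memory;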

Query the top 10 keys with the highest memory usage

  sqlite> select * from memory order by size_in_bytes desc limit 10;
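
Since the sample keys share prefixes such as orderAt:, it can also be useful to aggregate memory by key prefix. The following query assumes that ':' is used as the separator:

  sqlite> select substr(key, 1, instr(key, ':')-1) as prefix, count(*) as keys, sum(size_in_bytes) as bytes from memory where instr(key, ':') > 0 group by prefix order by bytes desc limit 10;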

Query lists with over 1000 members

  sqlite> select * from memory where type='list' and num_elements > 1000;
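
A breakdown by data type is also a quick way to see where the memory goes overall:

  sqlite> select type, count(*) as keys, sum(size_in_bytes) as bytes from memory group by type order by bytes desc;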

Conclusion

You can easily perform a static analysis of the memory usage of an ApsaraDB for Redis instance by using redis-rdb-tools and SQLite. The process is simple: once you have obtained the .rdb file, the whole analysis comes down to the following commands.

  rdb -c memory dump.rdb > memory.csv
  sqlite3 memory.db
  sqlite> create table memory(database int,type varchar(128),key varchar(128),size_in_bytes int,encoding varchar(128),num_elements int,len_largest_element varchar(128));
  sqlite> .mode csv memory
  sqlite> .import memory.csv memory

By using the preceding approach, one user found a list with over 10 GB of data, and another found a string value of over 43 MB. These results addressed the users’ concerns and helped them eliminate potential business risks and identify performance bottlenecks.
