January 4, 2025

Debugging Redis Memory Issues: A Visual Approach

Corbin

Your Redis instance is using 8GB of memory. You allocated 4GB. Alerts are firing. The ops team is paging you.

Time to find out where all that memory went.

Understanding Redis Memory

First, get the big picture:

redis-cli INFO memory

Key metrics:

Metric | Meaning
used_memory | Total bytes allocated by Redis
used_memory_rss | Memory from the OS perspective (includes fragmentation)
used_memory_dataset | Actual data size
used_memory_overhead | Redis internal structures
mem_fragmentation_ratio | used_memory_rss / used_memory (should be ~1.0)

If mem_fragmentation_ratio climbs well above 1.0 (rule of thumb: above 1.5), you have fragmentation issues. Below 1.0 means the OS has swapped part of Redis's memory to disk (very bad).
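You can compute the ratio yourself from captured INFO output. A minimal sketch; the sample numbers at the end are made up for illustration:

```shell
# frag_ratio: read "INFO memory" output on stdin, print used_memory_rss / used_memory.
frag_ratio() {
  awk -F: '
    /^used_memory:/     { used = $2 + 0 }   # total allocated by Redis
    /^used_memory_rss:/ { rss  = $2 + 0 }   # what the OS sees
    END { if (used > 0) printf "%.2f\n", rss / used }
  '
}

# Live: redis-cli INFO memory | frag_ratio
# Demo with captured output (illustrative values):
printf 'used_memory:1000000\r\nused_memory_rss:1500000\r\n' | frag_ratio
# → 1.50
```

The `$2 + 0` coercion also strips the trailing `\r` that redis-cli includes in INFO lines.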

Step 1: Find the Big Keys

The usual suspects are a few keys holding most of the data.

CLI approach:

redis-cli --bigkeys

This walks the whole keyspace with SCAN and reports the largest key per data type. It won't stall the server for long stretches, but it is slow on large databases and adds load.

Better approach:

redis-cli --memkeys-samples 10000

This is still a full scan, but it estimates each key's memory with MEMORY USAGE, sampling up to 10000 nested elements per key instead of walking them all. The numbers are estimates, not exact sizes.

Problem: Both commands give you key names, but not the full picture. You see "big_hash" uses 500MB but not why, or what's inside it.

Step 2: Deep Dive with MEMORY USAGE

For specific keys:

redis-cli MEMORY USAGE user:sessions

Returns bytes used by that key. But if you have thousands of keys matching a pattern, you'd need a script:

redis-cli --scan --pattern "cache:*" | while read -r key; do
  size=$(redis-cli MEMORY USAGE "$key")
  echo "$size $key"
done | sort -n -r | head -20

This is slow and tedious. You're running MEMORY USAGE for every key individually.
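Once you have one "&lt;bytes&gt; &lt;key&gt;" line per key (however you collected them), a short awk pass can roll the sizes up by key prefix. A sketch; the sample sizes below are made up:

```shell
# sum_by_prefix: read "<bytes> <key>" lines on stdin, total the bytes per
# key prefix (the text before the first colon), largest total first.
sum_by_prefix() {
  awk '{ split($2, parts, ":"); sum[parts[1]] += $1 }
       END { for (p in sum) printf "%s %d\n", p, sum[p] }' | sort -k2 -n -r
}

# Demo with made-up sizes:
printf '512 cache:a\n2048 cache:b\n100 session:x\n' | sum_by_prefix
# → cache 2560
#   session 100
```

This answers "which namespace is eating memory?" without eyeballing thousands of individual keys.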

Step 3: Visual Memory Analysis

This is where GUI tools save hours.

In Redimo:

  1. Connect to your Redis instance
  2. Create a Pattern Monitor (e.g., cache:*)
  3. View memory distribution by key pattern

The dashboard shows:

  • Total memory for matched keys
  • Memory per key (sortable)
  • Type distribution (are hashes eating more than strings?)
  • Keys without TTL (potential memory leaks)

Click any key to see its structure. A hash using 100MB? Expand it. See which fields are huge.

Common Memory Hogs

1. Serialized JSON in Strings

SET user:1001:profile "{\"name\":\"...\",\"preferences\":{...},\"history\":[...]}"

JSON in strings is convenient but inefficient. You can't query parts of it - you load the whole thing every time.

Fix: Use Redis Hashes for structured data, or consider RedisJSON for complex documents.

2. Unbounded Lists

LPUSH activity:feed:user:1001 "{...}"
# Repeat millions of times, no trimming

Activity feeds grow forever unless you trim them.

Fix:

LPUSH activity:feed:user:1001 "{...}"
LTRIM activity:feed:user:1001 0 999  # Keep last 1000

3. Expired Keys Not Cleaned Up

Redis expires keys lazily - it doesn't necessarily delete them the instant their TTL hits zero. Keys are removed when:

  • They are accessed (lazy, or "passive", expiration)
  • The background cycle randomly samples keys with TTLs and deletes the expired ones (active expiration)

If you have millions of expired keys and low traffic, they sit around consuming memory.

Check:

redis-cli INFO stats | grep expired_keys

Fix: Periodically SCAN the keyspace (touching keys can trigger lazy expiration), or increase hz in the config so the active expiration cycle runs more often.
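For reference, the knob lives in redis.conf. A sketch; 50 is an arbitrary example value, and a higher hz costs CPU:

```
# redis.conf
# Background task frequency (default 10). The active expiration cycle
# runs as part of this, so a higher hz reclaims expired keys sooner.
hz 50
```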

4. Large Sorted Sets for Leaderboards

ZADD leaderboard 1000 user:1
ZADD leaderboard 999 user:2
# ... millions of users

Sorted sets are efficient, but at scale they still consume significant memory.

Fix: Partition by time period (daily/weekly leaderboards), archive old data.

5. Client Output Buffers

Sometimes it's not your data - it's connected clients.

redis-cli CLIENT LIST

Look for clients with high omem (output buffer memory). A slow subscriber or a client not reading responses can accumulate buffers.
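CLIENT LIST prints one space-separated field=value line per client, so you can rank clients by omem with awk. A sketch; the two sample lines below are fabricated:

```shell
# top_omem: read CLIENT LIST output on stdin, print "omem addr" rows,
# biggest output buffer first.
top_omem() {
  awk '{
    addr = ""; omem = 0
    for (i = 1; i <= NF; i++) {
      if ($i ~ /^addr=/) addr = substr($i, 6)   # strip "addr="
      if ($i ~ /^omem=/) omem = substr($i, 6) + 0
    }
    printf "%d %s\n", omem, addr
  }' | sort -n -r
}

# Live: redis-cli CLIENT LIST | top_omem
# Demo with fabricated lines:
printf 'id=3 addr=10.0.0.5:51234 omem=1048576 cmd=subscribe\nid=4 addr=10.0.0.9:51240 omem=0 cmd=get\n' | top_omem
```

The top row points you at the connection to investigate before it triggers the buffer limit below.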

Fix:

CONFIG SET client-output-buffer-limit "normal 256mb 128mb 60"

This disconnects clients using too much buffer memory.

Memory Optimization Techniques

Use Appropriate Data Types

Scenario | Bad | Good
Object storage | JSON string | Hash
Unique items | List (with dedup logic) | Set
Ranked data | Manual sorting | Sorted Set
Counters | GET, modify, SET round-trips | INCR / INCRBY
Small integers | String "123" | Integer encoding (automatic)

Redis optimizes small hashes, lists, and sets into compact encodings. Check the hash-max-listpack-* settings (hash-max-ziplist-* before Redis 7).
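The thresholds are configurable. Defaults shown, using the Redis 7+ names:

```
# redis.conf
# Hashes at or below both limits stay in the compact listpack encoding;
# crossing either converts the hash to the regular hashtable encoding.
hash-max-listpack-entries 128
hash-max-listpack-value 64
```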

Enable Compression (Application Side)

const zlib = require('zlib');

// Before storing
const compressed = zlib.gzipSync(JSON.stringify(data));
await redis.set('key', compressed);

// When reading (ioredis: getBuffer returns a Buffer, not a string)
const buf = await redis.getBuffer('key');
const data = JSON.parse(zlib.gunzipSync(buf).toString());

For large values, compression can cut memory 50-80%.
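You can sanity-check the savings offline with plain gzip before touching application code. A sketch; the JSON below is a made-up repetitive payload:

```shell
# Build a repetitive ~5 KB JSON value and compare raw vs gzipped size.
json=$(printf '{"items":[%s]}' "$(yes '{"id":1,"name":"example"},' | head -200 | tr -d '\n' | sed 's/,$//')")
raw=${#json}
gz=$(printf '%s' "$json" | gzip -c | wc -c | tr -d ' ')
echo "raw=$raw bytes, gzipped=$gz bytes"
```

Repetitive payloads like this compress extremely well; your real ratio depends on how redundant your values are.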

Use Redis Hashes for Small Objects

Instead of:

SET user:1001:name "John"
SET user:1001:email "john@example.com"
SET user:1001:plan "pro"

Use:

HSET user:1001 name "John" email "john@example.com" plan "pro"

One hash uses less memory than multiple strings due to shared overhead.

Set Maxmemory Policy

CONFIG SET maxmemory 4gb
CONFIG SET maxmemory-policy allkeys-lru

Policies:

  • noeviction: Return errors when full (safe but painful)
  • allkeys-lru: Evict least recently used keys
  • volatile-lru: Only evict keys with TTL set
  • allkeys-random: Random eviction
  • volatile-ttl: Evict keys closest to expiration

Choose based on your use case. Cache? LRU makes sense. Important data? Maybe noeviction with alerts.

Monitoring Over Time

One-time analysis isn't enough. Memory issues often develop gradually.

Track these metrics:

  • used_memory trend over days/weeks
  • Key count growth rate
  • Evicted keys (if maxmemory is set)
  • Fragmentation ratio changes

Redimo's dashboard shows key counts and memory per pattern monitor. Set up multiple monitors for different key prefixes to see which areas are growing.

Quick Reference

Symptom | Likely Cause | Fix
Memory >> data size | Fragmentation | Restart Redis (planned maintenance) or enable activedefrag
Steady memory growth | Missing TTLs | Audit keys without expiration
Sudden spike | Large key created | Find with --bigkeys, investigate
High eviction rate | maxmemory too low | Increase it or shrink the data
High fragmentation | Many small allocations | Use hashes, enable activedefrag

Tools Summary

Tool | Pros | Cons
INFO memory | Quick overview | No key-level detail
--bigkeys | Finds obvious hogs | Slow full scan
--memkeys | Byte estimates per key | Estimates only, still a full scan
MEMORY USAGE | Exact per-key numbers | Manual, one key at a time
Redimo | Visual, pattern-based, explorable | Requires install

Memory debugging shouldn't feel like archaeology. With the right tools, you can find problems in minutes instead of hours.

Download Redimo and see exactly where your memory is going.
