MEMORY-USAGE
MEMORY USAGE reports how many bytes a key consumes, overhead included. Use it to find the bloated keys eating your RAM and optimize your data structures.
You'll Learn
- Per-key memory with MEMORY USAGE
- Memory overview with MEMORY STATS
- Finding memory hogs
- Optimization techniques
See Your Data, Not Terminal Text
Redimo visualizes every Redis data type beautifully. Edit inline, undo mistakes, stay safe in production.
1. MEMORY USAGE: Per-Key Analysis
MEMORY USAGE returns the total bytes used by a key, including value, key name overhead, and internal data structure overhead.
Basic Usage
# Check memory for a key
MEMORY USAGE mykey
(integer) 56
# With sample count for large collections
# Samples N random elements instead of all
MEMORY USAGE large-hash SAMPLES 1000
(integer) 1048576
SAMPLES Option
Use SAMPLES n to estimate memory from n random elements instead of scanning every element. The default is 5; higher values improve accuracy, and SAMPLES 0 measures all elements.
2. Understanding the Number
The byte count includes more than just your data:
Memory Components
SET tiny "x"
MEMORY USAGE tiny
(integer) 56
# Wait, "x" is 1 byte. Why 56?
# Breakdown:
# - Key name "tiny" (4 bytes)
# - Redis object header (~16 bytes)
# - String encoding overhead
# - Memory allocator overhead (rounding to jemalloc size class)
# Larger values have better efficiency
SET larger "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" # 33 chars
MEMORY USAGE larger
(integer) 80 # Only 24 more bytes for 32 more chars
Overhead Sources
- Redis object header (type, encoding, LRU, refcount)
- Key string storage
- Data structure internal pointers
- jemalloc allocation rounding
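To build intuition for these components, here is a rough back-of-the-envelope model in JavaScript. The size classes, header size, and sds overhead below are illustrative approximations, not Redis's exact accounting:

```javascript
// Rough estimate of a string key's footprint. All constants here are
// approximations for illustration -- not Redis's exact bookkeeping.
const SIZE_CLASSES = [8, 16, 32, 48, 64, 80, 96, 112, 128, 160, 192, 224, 256];

// jemalloc rounds each allocation up to a size class.
function roundToSizeClass(bytes) {
  const cls = SIZE_CLASSES.find((c) => bytes <= c);
  return cls !== undefined ? cls : Math.ceil(bytes / 256) * 256;
}

function estimateStringKeyBytes(key, value) {
  const objectHeader = 16; // robj: type, encoding, LRU clock, refcount
  const sdsOverhead = 4;   // sds length header + trailing NUL (varies)
  return (
    roundToSizeClass(key.length + sdsOverhead) +
    objectHeader +
    roundToSizeClass(value.length + sdsOverhead)
  );
}

// A 1-byte value still costs far more than 1 byte:
estimateStringKeyBytes('tiny', 'x');
```

The exact number MEMORY USAGE reports differs by Redis version and build; the point is that key name, object header, and allocator rounding dominate for tiny values.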
3. MEMORY STATS: Overview
MEMORY STATS gives a detailed breakdown of Redis's total memory usage.
MEMORY STATS
127.0.0.1:6379> MEMORY STATS
1) "peak.allocated"
2) (integer) 10485760
3) "total.allocated"
4) (integer) 8388608
5) "startup.allocated"
6) (integer) 1048576
7) "replication.backlog"
8) (integer) 1048576
9) "clients.slaves"
10) (integer) 0
11) "clients.normal"
12) (integer) 49694
13) "aof.buffer"
14) (integer) 0
15) "keys.count"
16) (integer) 10000
17) "keys.bytes-per-key"
18) (integer) 500
19) "dataset.bytes"
20) (integer) 5000000
21) "dataset.percentage"
22) "59.604648590087890625"
...
Key Metrics
- total.allocated - Current memory
- peak.allocated - Historical peak
- dataset.bytes - Your data
- dataset.percentage - Data vs overhead
Overhead Sources
- replication.backlog - Replication buffer
- clients.normal - Client buffers
- aof.buffer - AOF buffer
- startup.allocated - Base overhead
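Many clients deliver MEMORY STATS as the flat name/value array shown above (whether you get a flat array or a map depends on your client and RESP version — verify for yours). A small helper folds it into an object:

```javascript
// Fold a flat [name, value, name, value, ...] reply into an object.
function statsToObject(flat) {
  const stats = {};
  for (let i = 0; i + 1 < flat.length; i += 2) {
    stats[flat[i]] = flat[i + 1];
  }
  return stats;
}

const stats = statsToObject([
  'peak.allocated', 10485760,
  'total.allocated', 8388608,
  'keys.count', 10000,
]);
stats['total.allocated']; // 8388608
```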
4. MEMORY DOCTOR
MEMORY DOCTOR provides automatic recommendations based on current memory state.
MEMORY DOCTOR
127.0.0.1:6379> MEMORY DOCTOR
# Possible outputs:
# All good:
"Sam, I have no memory problems"
# Issues detected:
"High fragmentation detected.
Consider restarting Redis or using MEMORY PURGE"
"Peak memory is much higher than current usage.
Run MEMORY PURGE to release memory back to OS"
"Many keys with no TTL are using significant memory.
Consider setting expiration on appropriate keys"
5. Finding Memory Hogs
Combine SCAN with MEMORY USAGE to find your biggest keys.
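If you prefer application code over the shell, the aggregation step is the same everywhere: collect (key, bytes) pairs via SCAN + MEMORY USAGE, then sort. A client-agnostic sketch of the ranking step:

```javascript
// Rank [key, bytes] pairs and keep the N largest. Pure function, so it
// works with whatever client collected the sizes.
function topByMemory(pairs, n) {
  return [...pairs].sort((a, b) => b[1] - a[1]).slice(0, n);
}

topByMemory(
  [['session:abc', 5242880], ['counter', 56], ['user:1001', 1048576]],
  2
);
// [['session:abc', 5242880], ['user:1001', 1048576]]
```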
CLI Approach
# Find biggest keys (redis-cli)
redis-cli --bigkeys
# Sample output:
# Biggest string: session:abc123 (5242880 bytes)
# Biggest hash: user:1001:profile (1048576 bytes)
# Biggest list: queue:pending (10000 items)
# Memory analysis with sampling
redis-cli --memkeys
# Custom script to find top 20 keys by memory
redis-cli --scan | while read -r key; do
size=$(redis-cli MEMORY USAGE "$key")
echo "$size $key"
done | sort -rn | head -20
--bigkeys vs --memkeys
--bigkeys finds the largest keys by element count (items in a list, fields in a hash). --memkeys estimates actual memory usage. A hash with 10 fields can use more memory than a list with 1000 items if the hash values are large.
6. INFO Memory Section
INFO memory
127.0.0.1:6379> INFO memory
# Memory
used_memory:8388608
used_memory_human:8.00M
used_memory_rss:12582912
used_memory_rss_human:12.00M
used_memory_peak:10485760
used_memory_peak_human:10.00M
mem_fragmentation_ratio:1.50
mem_fragmentation_bytes:4194304
mem_allocator:jemalloc-5.2.1
Fragmentation Ratio
- ~1.0 - Healthy, minimal fragmentation
- >1.5 - Significant fragmentation, memory is wasted
- <1.0 - Redis is using swap! Very bad for performance
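INFO output is plain text, so deriving the ratio yourself is straightforward. A sketch, assuming a Node.js environment:

```javascript
// Parse INFO memory's "field:value" lines into an object, then derive
// the fragmentation ratio (RSS / logically used memory).
function parseInfoMemory(text) {
  const fields = {};
  for (const line of text.split('\n')) {
    const i = line.indexOf(':');
    if (i > 0 && !line.startsWith('#')) {
      fields[line.slice(0, i)] = line.slice(i + 1).trim();
    }
  }
  return fields;
}

const info = parseInfoMemory('used_memory:8388608\nused_memory_rss:12582912');
const ratio = Number(info.used_memory_rss) / Number(info.used_memory);
// ratio === 1.5, matching the mem_fragmentation_ratio above
```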
7. MEMORY PURGE
MEMORY PURGE attempts to release memory back to the OS. Useful after deleting large amounts of data.
MEMORY PURGE
# Delete lots of data
DEL huge-key-1 huge-key-2 huge-key-3
# Memory might not drop immediately (allocator holds pages)
INFO memory | grep used_memory_rss_human
used_memory_rss_human:1.00G # Still high
# Purge to release back to OS
MEMORY PURGE
OK
INFO memory | grep used_memory_rss_human
used_memory_rss_human:500.00M # Better
When to PURGE
Run MEMORY PURGE after bulk deletions, or when used_memory_rss stays well above used_memory. Note it only has an effect with the jemalloc allocator (the default on Linux).
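Whether a purge is worth running can be decided from the same two INFO fields. A sketch; the 1.5 threshold is an assumption, not an official Redis guideline:

```javascript
// Decide whether a purge is likely worthwhile: allocator pages are
// usually reclaimable when RSS greatly exceeds used memory. The 1.5
// threshold here is an assumption, not an official recommendation.
function shouldPurge(usedMemory, usedMemoryRss, threshold = 1.5) {
  return usedMemoryRss / usedMemory > threshold;
}

shouldPurge(500_000_000, 1_000_000_000); // ratio 2.0  -> true
shouldPurge(8_388_608, 8_800_000);       // ratio ~1.05 -> false
```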
8. Optimization Strategies
Use Efficient Data Types
# Bad: Many small strings
SET user:1001:name "John"
SET user:1001:email "john@example.com"
SET user:1001:age "30"
# 3 keys × ~50 bytes overhead each = ~150 bytes overhead
# Good: One hash
HSET user:1001 name "John" email "john@example.com" age "30"
# 1 key overhead, hash uses ziplist encoding for small data
Enable Compression
# In redis.conf - tune ziplist/listpack thresholds
# Hashes: use ziplist for small hashes
hash-max-ziplist-entries 512
hash-max-ziplist-value 64
# Lists: quicklist node size (-2 = 8 KB per node)
list-max-ziplist-size -2
# Sets: use intset for integer-only sets
set-max-intset-entries 512
# Sorted Sets: use ziplist for small zsets
zset-max-ziplist-entries 128
zset-max-ziplist-value 64
# Redis 7+ renames these to *-max-listpack-*; the old names still work as aliases
Application-Level Compression
// Compress large values before storing (Node.js; ioredis-style client assumed)
const zlib = require('zlib');
const compressed = zlib.gzipSync(JSON.stringify(largeObject));
await redis.set('key', compressed);
// Decompress when reading
const data = JSON.parse(zlib.gunzipSync(await redis.getBuffer('key')));
9. CLI vs Redimo
CLI Challenges
- Running MEMORY USAGE on each key manually
- Parsing --bigkeys output
- No visual size comparison
- Scripting required for pattern analysis
Redimo Benefits
- Memory column in key browser
- Sort keys by memory usage
- Pattern monitor shows total memory per pattern
- Visual identification of large values
Quick Reference
| Command | Purpose |
|---|---|
| MEMORY USAGE key | Bytes used by a specific key |
| MEMORY STATS | Detailed memory breakdown |
| MEMORY DOCTOR | Automatic recommendations |
| MEMORY PURGE | Release memory to OS |
| INFO memory | Memory overview and fragmentation |
| --bigkeys | Find largest keys (CLI flag) |
| --memkeys | Estimate memory per key (CLI flag) |
Analyze Your Memory
Memory problems are easier to fix when you can see them. Find your memory hogs with Redimo.
Download Redimo - It's Free