Avoid using Redis bigkeys

Summary

A Redis bigkey is a key holding a large amount of data: a string whose value is very large, or a hash, list, set, or sorted set with a very large number of elements. Because its size far exceeds that of other keys, it can cause a series of problems such as uneven memory distribution, timeout blocking, and network traffic congestion.

Risks

Uneven memory distribution

A bigkey causes memory to be distributed unevenly across cluster nodes (the smallest allocation unit in redis-cluster is a slot, and each slot holds multiple keys), which indirectly skews access traffic toward particular nodes, complicates unified cluster management, and creates a latent risk of data loss.
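The slot mapping mentioned above is why a bigkey cannot be rebalanced at any finer granularity: Redis Cluster assigns each key to one of 16384 slots via CRC16(key) mod 16384, so the whole bigkey always lands on a single node. A minimal sketch of that mapping, using the CRC16-CCITT (XMODEM) variant that Redis Cluster uses (this ignores hash tags, which real Redis also honors):

```python
def crc16(data: bytes) -> int:
    """CRC16-CCITT (XMODEM): polynomial 0x1021, initial value 0."""
    crc = 0
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

def key_slot(key: str) -> int:
    """Map a key to one of Redis Cluster's 16384 slots."""
    return crc16(key.encode()) % 16384
```

However large the value is, `key_slot` puts it in exactly one slot on one node, so the only way to spread a bigkey out is to split it into several keys.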

Timeout blocking

Because Redis runs a single-threaded command loop, operating on a bigkey can easily block the server, and in severe cases trigger a sentinel master-replica failover. Common offenders include smembers, hgetall, del, and the automatic expiration of a bigkey; these operations usually show up in the Redis slow log.
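One way to avoid that blocking is to drain the key in small batches (HSCAN plus HDEL for hashes, or UNLINK on Redis 4+, which frees the memory in a background thread) instead of issuing one big DEL or HGETALL. A runnable sketch of the batching idea, with a plain dict standing in for the Redis hash so no server is needed:

```python
def delete_big_hash(table: dict, batch_size: int = 100) -> None:
    """Drain a big hash in small batches instead of one blocking DEL.

    With a real client this loop would be HSCAN to collect a batch of
    fields, then HDEL on just that batch, so no single command holds
    the event loop for long. Here `table` is a dict standing in for
    the Redis hash, which keeps the sketch self-contained.
    """
    while table:
        batch = list(table)[:batch_size]   # real code: fields from HSCAN
        for field in batch:
            del table[field]               # real code: r.hdel(key, *batch)
```

Each iteration touches at most `batch_size` fields, so other clients get served between batches; the trade-off is that deletion takes many round trips instead of one.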

Network traffic congestion

If a bigkey also happens to be a hotkey, it can easily congest the network. For example, a 1 MB bigkey accessed several thousand times per second is catastrophic for a server with an ordinary gigabit NIC (at most about 128 MB/s), and is heavy pressure even for a server with a 10 Gbit NIC.
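A back-of-the-envelope check of the scenario above (the 3000 QPS figure is an illustrative stand-in for "several thousand times per second"):

```python
value_mb = 1                  # size of the bigkey's value
qps = 3000                    # requests per second against it
traffic_mb_s = value_mb * qps # outbound traffic: 3000 MB/s

gigabit_nic_mb_s = 128        # 1 Gbit/s ≈ 128 MB/s
ten_gbit_nic_mb_s = 1280      # 10 Gbit/s ≈ 1280 MB/s

print(traffic_mb_s / gigabit_nic_mb_s)   # >23x a gigabit NIC's capacity
print(traffic_mb_s / ten_gbit_nic_mb_s)  # >2x even a 10 Gbit NIC
```

So the traffic generated by a single hot bigkey can exceed the NIC's entire capacity many times over, starving every other key on the node.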

How to find bigkeys

redis-cli --bigkeys

redis-cli --bigkeys samples keys in Redis via SCAN and reports the largest key of each type, so there is no need to worry about it blocking the server

$ redis-cli --bigkeys
# Scanning the entire keyspace to find biggest keys as well as
# average sizes per key type. You can use -i 0.1 to sleep 0.1 sec
# per 100 SCAN commands (not usually needed).
[00.00%] Biggest string found so far 'key-419' with 3 bytes
[05.14%] Biggest list found so far 'mylist' with 100004 items
[35.77%] Biggest string found so far 'counter:__rand_int__' with 6 bytes
[73.91%] Biggest hash found so far 'myobject' with 3 fields
-------- summary -------
Sampled 506 keys in the keyspace!
Total key length in bytes is 3452 (avg len 6.82)
Biggest string found 'counter:__rand_int__' has 6 bytes
Biggest list found 'mylist' has 100004 items
Biggest hash found 'myobject' has 3 fields
504 strings with 1403 bytes (99.60% of keys, avg size 2.78)
1 lists with 100004 items (00.20% of keys, avg size 100004.00)
0 sets with 0 members (00.00% of keys, avg size 0.00)
1 hashs with 3 fields (00.20% of keys, avg size 3.00)
0 zsets with 0 members (00.00% of keys, avg size 0.00)

How to avoid bigkeys

The main approach is to split the bigkey into multiple smaller keys, fetch them back with mget, and merge the results at the application layer
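A sketch of that split/merge idea for a large string value: at write time the value is chunked across several sub-keys, and at read time all sub-keys are fetched in one round trip (MGET in real code) and concatenated in the application. The key-naming scheme and chunk size are illustrative, and a dict stands in for the Redis server so the example runs standalone:

```python
CHUNK = 1024  # illustrative chunk size; pick one that keeps each sub-key small

def split_set(store: dict, key: str, value: bytes, chunk: int = CHUNK) -> None:
    """Store value as key:0, key:1, ... plus key:count (real code: r.set)."""
    parts = [value[i:i + chunk] for i in range(0, len(value), chunk)] or [b""]
    for i, part in enumerate(parts):
        store[f"{key}:{i}"] = part
    store[f"{key}:count"] = len(parts)

def merged_get(store: dict, key: str) -> bytes:
    """Fetch all sub-keys (one r.mget call in real code) and merge them."""
    n = store[f"{key}:count"]
    parts = [store[f"{key}:{i}"] for i in range(n)]
    return b"".join(parts)
```

Each sub-key is now small enough that no single command touches much data, and in a cluster the sub-keys hash to different slots, spreading memory and traffic across nodes.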
