This article walks through the eight eviction policies in Redis and how to choose among them. I hope it will be helpful to everyone!
We know that a Redis cache holds its data in memory, and memory is finite. As the amount of cached data grows, the cache space will inevitably fill up, and at that point an eviction policy is needed to decide which data to delete.
Redis's eviction policies can first be split by whether they evict data at all: noeviction, which never evicts, and seven policies that do. The seven eviction policies can be further divided into two categories by the scope of the candidate data set:
Evict only among keys with an expiration time set: volatile-random, volatile-ttl, volatile-lru, and volatile-lfu (added in Redis 4.0).
Evict across the entire key space: allkeys-lru, allkeys-random, and allkeys-lfu (added in Redis 4.0).
Before Redis 3.0 the default policy was volatile-lru; from Redis 3.0 (inclusive) onward, the default eviction policy is noeviction.
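In practice the policy is chosen in redis.conf, or at runtime with CONFIG SET. A minimal sketch with illustrative values (adjust the memory limit to your workload):

```conf
# redis.conf (illustrative values)
# Memory limit that triggers eviction:
maxmemory 2gb
# Eviction policy; noeviction is the default since Redis 3.0:
maxmemory-policy noeviction
# Sample size used by the approximate LRU/LFU algorithms:
maxmemory-samples 5
```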
noeviction means no data is ever evicted. When the cache is full and a new write request arrives, Redis no longer serves the write and returns an error directly (reads are still served).
The four policies volatile-random, volatile-ttl, volatile-lru, and volatile-lfu only consider key-value pairs that have an expiration time set. When Redis's memory usage reaches the maxmemory threshold, Redis evicts among those keys according to the chosen policy.
The three policies allkeys-lru, allkeys-random, and allkeys-lfu widen the candidate set to all key-value pairs, whether or not an expiration time is set. Their selection rules are:
allkeys-random: randomly selects data to evict.
allkeys-lru: filters all data with the LRU algorithm.
allkeys-lfu: filters all data with the LFU algorithm.
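To illustrate what "filtering by frequency" means under LFU, here is a minimal sketch in plain Python. Note that this is not Redis's actual implementation: Redis tracks frequency with a probabilistic 8-bit logarithmic counter per key, not exact counts.

```python
from collections import Counter

def pick_lfu_victim(access_log):
    """Return the least frequently used key from a sequence of accesses
    (ties broken arbitrarily). A toy model of the LFU selection rule."""
    counts = Counter(access_log)
    return min(counts, key=counts.get)

accesses = ["a", "b", "a", "c", "a", "b"]
victim = pick_lfu_victim(accesses)  # "c": accessed only once
```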
LRU stands for Least Recently Used. A textbook LRU implementation maintains a linked list of entries ordered by access time; as the amount of data grows, moving list nodes on every access takes more time, which would inevitably affect the Redis main thread. For this reason, Redis implements a simplified, approximate version of LRU.
The core idea of the LRU policy: if a piece of data has just been accessed, it is probably hot data and is likely to be accessed again.
Following this idea, Redis keeps an lru field in the redisObject structure of each key, recording the timestamp of the data's most recent access. When evicting, the LRU policy removes the candidate with the smallest lru value, that is, the data that has gone unaccessed the longest.
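The rule "evict the entry with the smallest access timestamp" can be sketched with a tiny exact-LRU cache in plain Python (a toy model, not Redis internals; a logical clock stands in for the lru timestamp field):

```python
class ExactLRUCache:
    """Toy exact-LRU cache: every entry keeps a last-access stamp
    (playing the role of the lru field in redisObject), and eviction
    removes the entry with the smallest stamp."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = {}    # key -> value
        self.stamp = {}   # key -> last-access logical time
        self.clock = 0

    def _touch(self, key):
        self.clock += 1
        self.stamp[key] = self.clock

    def get(self, key):
        if key not in self.data:
            return None
        self._touch(key)
        return self.data[key]

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            victim = min(self.stamp, key=self.stamp.get)  # oldest access
            del self.data[victim]
            del self.stamp[victim]
        self.data[key] = value
        self._touch(key)

cache = ExactLRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")     # "a" becomes the most recently used key
cache.put("c", 3)  # cache is full: evicts "b", the least recently used
```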
So, in business scenarios where some data is accessed frequently, the LRU policy can indeed retain the most recently accessed data. And because that retained data is likely to be accessed again, the application's access latency improves.
Concretely: whenever a key-value pair is accessed, Redis records its access timestamp. When Redis decides to evict, it randomly samples N keys as a candidate pool and evicts the one with the smallest timestamp. On subsequent evictions, only keys with timestamps smaller than those already in the pool may enter it; once the pool holds maxmemory-samples entries, the entry with the smallest timestamp is evicted.
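The sampling step above can be sketched as follows (a simplified model, not Redis source code: it samples once and evicts immediately, without the persistent candidate pool):

```python
import random

def evict_one(store, last_access, samples=5):
    """Approximate LRU in the spirit of Redis: instead of ordering every
    key, sample up to `samples` random keys (cf. maxmemory-samples) and
    evict the sampled key with the oldest access timestamp."""
    candidates = random.sample(list(store), min(samples, len(store)))
    victim = min(candidates, key=lambda k: last_access[k])
    del store[victim]
    del last_access[victim]
    return victim

store = {"a": 1, "b": 2, "c": 3}
last_access = {"a": 100, "b": 50, "c": 200}  # "b" has the oldest access
evicted = evict_one(store, last_access)      # samples >= 3 keys: evicts "b"
```

With a small sample size, the evicted key is only approximately the least recently used one; raising maxmemory-samples makes the approximation closer to exact LRU at a higher CPU cost.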
You can set the sample size with the command CONFIG SET maxmemory-samples N.
Based on these characteristics, different policies suit different scenarios.
If data is accessed with roughly uniform frequency and there is no clear hot set, the allkeys-random policy is a reasonable choice: it simply evicts at random.
If the workload has a clear hot set, use allkeys-lru (or volatile-lru) to keep the most recently accessed data in the cache.
If some data must always stay cached (for example, top-ranked content), set no expiration time on it and use the volatile-lru policy. That data is then never an eviction candidate, while the other keys, which do have a TTL, are evicted by LRU rules.
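The "pinning" effect described above, where keys without a TTL are never eviction candidates under a volatile-* policy, can be modelled in a few lines (a toy simulation, not Redis internals; the key names are made up for illustration):

```python
def volatile_lru_victim(last_access, keys_with_ttl):
    """Under volatile-lru, only keys with an expiration set are eviction
    candidates; keys without a TTL are effectively pinned in the cache."""
    candidates = [k for k in last_access if k in keys_with_ttl]
    if not candidates:
        return None  # nothing evictable; Redis would answer writes with an error
    return min(candidates, key=last_access.get)

last_access = {"top10:list": 5, "session:1": 10, "session:2": 20}
keys_with_ttl = {"session:1", "session:2"}  # "top10:list" has no TTL: pinned
victim = volatile_lru_victim(last_access, keys_with_ttl)  # "session:1"
```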