Cache Avalanche#
Cache Avalanche: When a large amount of cached data expires at the same time, or when Redis itself fails, the many user requests that would normally be served from Redis all fall through to the database at once. The sudden spike in database load can, in severe cases, crash the database and bring down the entire system.
Solutions:
- Avoid setting the same expiration time for a large batch of data. Add a random offset (jitter) to each key's expiration time so that keys loaded together do not all expire at the same moment.
- Mutex lock: If the requested data is not in Redis, acquire a mutex lock so that only one request rebuilds the cache (reads the data from the database and writes it back to Redis) at a time, releasing the lock once the cache is rebuilt. Requests that fail to acquire the lock either wait for it to be released and then re-read the cache, or return a null or default value immediately. Always set a timeout on the lock: if the lock holder hits an unexpected error and never releases it, other requests would otherwise block forever and the system would become unresponsive.
- Background cache updates: Business threads no longer update the cache, and the cache is given no expiration time, making it effectively "permanently valid"; a background thread refreshes it periodically instead.
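The first solution above, randomized expiration, can be sketched as follows. This is a minimal illustration using an in-memory dict as a stand-in for Redis; the names `cache_set_with_jitter`, `BASE_TTL`, and `JITTER_RANGE` are hypothetical, not a real Redis API (with a real client you would pass the jittered TTL as the `ex` argument to `SET`).

```python
import random
import time

# In-memory stand-in for Redis: key -> (value, absolute expiry timestamp).
store = {}

BASE_TTL = 600      # intended expiration, in seconds (illustrative value)
JITTER_RANGE = 60   # spread expirations over an extra 0-60 seconds

def cache_set_with_jitter(key, value, base_ttl=BASE_TTL, jitter=JITTER_RANGE):
    """Set a key with a randomized TTL so that many keys loaded
    together do not all expire at the same instant."""
    ttl = base_ttl + random.randint(0, jitter)
    store[key] = (value, time.time() + ttl)
    return ttl

# Keys warmed up in one batch now expire at slightly different times.
ttls = [cache_set_with_jitter(f"user:{i}", {"id": i}) for i in range(1000)]
```

Because each key's TTL is drawn independently, a batch of keys loaded at the same moment expires spread out over the jitter window instead of all at once.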
Cache Breakdown#
Cache Breakdown: When a piece of hot data in the cache expires while a large number of requests are accessing it, those requests all miss the cache and go straight to the database, which can be overwhelmed by the sudden concurrency.
Solutions:
- Mutex lock: Ensure that only one thread rebuilds the cache at a time. Requests that fail to acquire the lock either wait for it to be released or return a null or default value directly.
- Do not set an expiration time for hot data. Instead, have a background thread refresh the cache asynchronously, or detect that a hot key is about to expire and notify a background thread to rebuild it and reset its expiration time before it expires.
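The mutex approach can be sketched like this. It is a single-process illustration: a `threading.Lock` with an acquire timeout stands in for a distributed lock (which in Redis would typically be built with `SET key value NX EX`), and a dict stands in for the cache. The function names and the simulated database are assumptions for the example.

```python
import threading
import time

store = {}                  # stand-in for the Redis cache
lock = threading.Lock()     # stand-in for a distributed mutex
db_reads = []               # records every time the "database" is actually hit

def load_from_db(key):
    db_reads.append(key)    # count database hits
    time.sleep(0.05)        # simulate a slow query
    return f"value-of-{key}"

def get_with_mutex(key, lock_timeout=1.0):
    value = store.get(key)
    if value is not None:
        return value
    # Only one caller rebuilds the cache; the rest wait (bounded by the
    # timeout) or fall back to a default value.
    if lock.acquire(timeout=lock_timeout):
        try:
            value = store.get(key)   # re-check after winning the lock
            if value is None:
                value = load_from_db(key)
                store[key] = value
            return value
        finally:
            lock.release()
    return None   # could not get the lock in time: return a default

# 20 concurrent requests for the same expired hot key.
threads = [threading.Thread(target=get_with_mutex, args=("hot",))
           for _ in range(20)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

After the run, the database has been queried exactly once even though twenty requests raced on the same key: the lock winner rebuilds the cache, and every later lock holder finds the value on the re-check.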
Cache Penetration#
Cache Penetration: When the requested data exists in neither the cache nor the database, the cache lookup misses, the database lookup also finds nothing, and no cache entry can be built to serve later requests. A flood of such requests puts sudden pressure on the database and can crash it.
Solutions:
- Illegal request validation: Validate parameters at the request entry point to check that they are reasonable and well-formed, and reject invalid ones with an error immediately so they never reach the cache or the database.
- Cache null values or default values: When a query finds no data in the database, cache a null or default value for that key, so that subsequent requests for it read the null/default value directly from the cache and return it to the application without querying the database again.
- Bloom filter: When writing data to the database, also record its key in a Bloom filter. When a user request arrives, first query the Bloom filter: if it reports the key as absent, the data definitely does not exist, so the request can be answered without touching the database, keeping the database load normal.
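The Bloom filter check can be sketched as below. This is a minimal, hand-rolled filter for illustration only; a production setup would more likely use the RedisBloom module or a tuned library. The `lookup` function and the sample keys are assumptions for the example. Note the one-sided guarantee: "absent" answers are always correct, while "present" answers may rarely be false positives, which merely cause an unnecessary database query.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter sketch: k hash positions over a fixed bit array."""
    def __init__(self, size=1024, num_hashes=3):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = [False] * size

    def _positions(self, key):
        # Derive num_hashes independent positions from salted SHA-256 digests.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{key}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, key):
        for pos in self._positions(key):
            self.bits[pos] = True

    def might_contain(self, key):
        # False means "definitely absent"; True means "possibly present".
        return all(self.bits[pos] for pos in self._positions(key))

existing_keys = BloomFilter()
for user_id in ("user:1", "user:2", "user:3"):  # populated on database writes
    existing_keys.add(user_id)

def lookup(key):
    if not existing_keys.might_contain(key):
        return None   # definitely absent: reject without cache/database access
    return "query cache, then database"
```

A request for a key the filter has never seen is almost always rejected up front, so floods of lookups for nonexistent data never reach the database.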