Cache
- Good for data that's read frequently but modified infrequently
- It's good practice to set an expiration policy that determines when stale data is removed from the cache
- LRU - evict the least-recently-used data
- Databases frequently include some form of built-in caching, which you can tune for specific use cases
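A minimal in-process sketch of LRU eviction (in practice the cache would be Redis or Memcached; here an OrderedDict stands in):

```python
from collections import OrderedDict

class LRUCache:
    """Toy LRU cache: at capacity, evicts the least-recently-used key."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def set(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict the LRU entry

cache = LRUCache(2)
cache.set("a", 1)
cache.set("b", 2)
cache.get("a")         # touch "a" so it becomes most recently used
cache.set("c", 3)      # over capacity -> evicts "b"
print(cache.get("b"))  # None
print(cache.get("a"))  # 1
```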
Caching Queries
- This approach is tough with complex queries
- If one piece of data changes, you need to find and invalidate every cached query that might include it
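A sketch of why query caching makes invalidation hard. The `cache_key` helper is hypothetical; the point is that keys are opaque hashes of the query text, so you can't easily tell which cached entries a row change affects:

```python
import hashlib
import json

query_cache = {}  # query-hash -> cached result rows

def cache_key(sql, params):
    # hypothetical helper: key = hash of the query text + its parameters
    raw = sql + json.dumps(params, sort_keys=True)
    return hashlib.sha256(raw.encode()).hexdigest()

def cached_query(sql, params, run):
    key = cache_key(sql, params)
    if key not in query_cache:
        query_cache[key] = run(sql, params)  # miss -> hit the db
    return query_cache[key]

# The pain point from the notes: when one row changes, every cached
# query that might include it must be invalidated -- but the keys are
# opaque hashes, so the blunt (and common) fix is to flush everything.
def invalidate_all():
    query_cache.clear()

calls = []
fake_db = lambda sql, params: calls.append(sql) or [("alice", 30)]
cached_query("SELECT * FROM users WHERE age > ?", [25], fake_db)
cached_query("SELECT * FROM users WHERE age > ?", [25], fake_db)
print(len(calls))  # 1 -- the second call was served from the cache
```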
Caching Objects
- Cache web pages, user sessions
- Cache objects by serializing them, e.g. cache.set(key, json.dumps(obj))
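A small sketch of the set/get round trip, assuming a dict in place of a real cache client like Redis (objects must be serialized to strings on the way in and parsed on the way out):

```python
import json

store = {}  # stand-in for a cache client such as Redis

def cache_set(key, obj):
    store[key] = json.dumps(obj)  # serialize the object to a string

def cache_get(key):
    raw = store.get(key)
    return json.loads(raw) if raw is not None else None

session = {"user_id": 42, "roles": ["admin"]}
cache_set("session:42", session)
print(cache_get("session:42"))  # {'user_id': 42, 'roles': ['admin']}
print(cache_get("session:99"))  # None -- cache miss
```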
Write-Through
- The cache always sits between the db and the client - for reads and writes
- Writes go to the cache, the cache writes to db
Disadvantages
- Data can be lost if the cache goes down before a write has been forwarded to the db (a bigger risk with write-back variants, where db writes are deferred)
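A minimal sketch of the write-through flow, with dicts standing in for the cache and the db; the key property is that every write is forwarded to the db before it is acknowledged:

```python
class WriteThroughCache:
    """Write-through: the client talks only to the cache, and every
    write is synchronously persisted to the db."""
    def __init__(self, db):
        self.db = db      # stand-in for the real database
        self.cache = {}

    def read(self, key):
        if key in self.cache:
            return self.cache[key]   # cache hit
        value = self.db.get(key)     # miss -> fall back to the db
        if value is not None:
            self.cache[key] = value
        return value

    def write(self, key, value):
        self.cache[key] = value
        self.db[key] = value         # synchronous write to the db

db = {}
c = WriteThroughCache(db)
c.write("user:1", "alice")
print(db["user:1"])       # alice -- already durable in the db
print(c.read("user:1"))   # alice -- served from the cache
```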
Hotkey Issue
- When one key receives a disproportional amount of traffic
- To resolve this, append a random suffix to the hot key so its traffic hashes to several different copies/shards instead of one
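A sketch of the random-suffix trick. NUM_COPIES is an assumed fan-out factor: each read picks one suffixed copy at random, while updates must touch every copy:

```python
import random

NUM_COPIES = 4  # assumed fan-out: how many copies of the hot key exist

def shard_key(hot_key):
    # read path: pick one of N suffixed copies at random, so traffic
    # for one hot key spreads across several cache shards
    return f"{hot_key}#{random.randrange(NUM_COPIES)}"

def all_copies(hot_key):
    # write/invalidation path: every copy must be updated together
    return [f"{hot_key}#{i}" for i in range(NUM_COPIES)]

print(all_copies("top-seller"))
# ['top-seller#0', 'top-seller#1', 'top-seller#2', 'top-seller#3']
```

The trade-off: reads spread evenly, but writes get N times more expensive, so this fits read-heavy hot keys best.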
Redis
- Set a TTL expiry on data
- Or setup LRU eviction
Redis as a Rate-Limiter
- Ex. if a service can handle 5 requests per minute, use a Redis counter with a 60s TTL, increment it on each request, and reject once the count exceeds 5
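A pure-Python simulation of that fixed-window limiter (in Redis this would be INCR plus EXPIRE; here a dict stands in so the logic is visible):

```python
import time

LIMIT = 5     # max requests per window
WINDOW = 60   # seconds -- the Redis TTL in the real setup

counters = {}  # client id -> (count, window expiry); stands in for Redis

def allow(client_id, now=None):
    now = time.time() if now is None else now
    count, expiry = counters.get(client_id, (0, now + WINDOW))
    if now >= expiry:                 # TTL expired: start a fresh window
        count, expiry = 0, now + WINDOW
    count += 1                        # Redis equivalent: INCR client_id
    counters[client_id] = (count, expiry)
    return count <= LIMIT

# 5 requests pass; the 6th in the same window is rejected
results = [allow("user:1", now=100.0) for _ in range(6)]
print(results)  # [True, True, True, True, True, False]
```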
Message Queues in Redis
- Consumer group - a named cursor (pointer) into a stream; Redis tracks the last-delivered entry ID per group so multiple consumers can share the work without re-reading old entries
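A toy model of that idea, assuming integer entry IDs in place of Redis stream IDs: the stream is an append-only log, and a consumer group is just a cursor into it that advances as entries are delivered:

```python
class Stream:
    """Toy Redis-stream model: an append-only log, where each consumer
    group is a cursor recording the last entry delivered to it."""
    def __init__(self):
        self.entries = []       # append-only list of (id, payload)
        self.next_id = 0
        self.group_cursor = {}  # group name -> index of next entry

    def add(self, payload):
        self.entries.append((self.next_id, payload))
        self.next_id += 1

    def read_group(self, group, count=1):
        # deliver entries after the group's cursor, then advance it
        start = self.group_cursor.get(group, 0)
        batch = self.entries[start:start + count]
        self.group_cursor[group] = start + len(batch)
        return batch

s = Stream()
s.add("job-a"); s.add("job-b"); s.add("job-c")
print(s.read_group("workers", count=2))  # [(0, 'job-a'), (1, 'job-b')]
print(s.read_group("workers"))           # [(2, 'job-c')]
```

This omits what real consumer groups also track (pending/unacked entries per consumer), but it shows why a group is essentially a pointer into the stream.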