Cache
Dive into caching mechanisms to speed up system performance and user experience.

Published At
7/12/2022
Reading Time
~ 3 min read
A cache is a temporary storage area that stores the results of expensive responses or frequently accessed data in memory so that subsequent requests are served more quickly.
Every time a new web page loads, one or more database calls are executed to fetch data. Calling the database repeatedly significantly hurts application performance; a cache can mitigate that problem.
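To make this concrete, here is a minimal cache-aside read path in Python. The plain dict standing in for the cache and the `query_database` stub are illustrative assumptions; a real system would use a dedicated cache server such as Redis or Memcached.

```python
import time

cache = {}  # illustrative stand-in for a real cache server


def query_database(user_id):
    # Placeholder for an expensive database query.
    time.sleep(0.1)  # simulate query latency
    return {"id": user_id, "name": f"user-{user_id}"}


def get_user(user_id):
    # Cache-aside read path: check the cache first,
    # fall back to the database on a miss, then populate the cache.
    if user_id in cache:
        return cache[user_id]       # cache hit: no database call
    data = query_database(user_id)  # cache miss: expensive call
    cache[user_id] = data           # store for subsequent requests
    return data
```

The first call for a given `user_id` pays the database cost; every subsequent call is served from memory.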
Cache Tier
A cache tier is a temporary data store layer that is much faster than the database. The benefits of having a separate cache tier include:
- Better system performance
- Ability to reduce database workloads
- Ability to scale the cache tier independently
Considerations for using a cache
- A cache works best when data is read frequently but modified infrequently. Since cached data is stored in volatile memory, a cache server is not ideal for persisting data.
- Expiration policy: It's good practice to set an expiration time on cached data. When none is set, the data stays in memory permanently. Pick the TTL carefully (a minimal sketch follows this item):
  - Not too short: the cache has to reload data from the database too frequently.
  - Not too long: the data becomes stale.
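One simple way to implement expiration is to store each value alongside its expiry timestamp and check it on every read. This is only a sketch under the same plain-dict assumption as above; cache servers like Redis provide TTLs natively.

```python
import time

TTL_SECONDS = 60  # illustrative value: balance reload frequency against staleness
cache = {}


def cache_set(key, value, ttl=TTL_SECONDS):
    # Store the value together with its expiration timestamp.
    cache[key] = (value, time.monotonic() + ttl)


def cache_get(key):
    entry = cache.get(key)
    if entry is None:
        return None                 # miss: key was never cached
    value, expires_at = entry
    if time.monotonic() > expires_at:
        del cache[key]              # expired: evict and treat as a miss
        return None
    return value
```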
- Consistency: This involves keeping the data store and the cache in sync. Inconsistency can happen because modifications to the data store and the cache are not executed in a single transaction. When scaling across multiple regions, maintaining consistency between the data store and the cache becomes even more challenging. One common write path is sketched below.
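A common (though not the only) way to limit inconsistency is to invalidate the cached copy on every write. The sketch below uses plain dicts as stand-ins for the real data store and cache; note the comment about the race window, which is exactly the transaction problem described above.

```python
database = {}  # stand-in for the real data store
cache = {}


def write_to_database(key, value):
    database[key] = value


def update_user(user_id, new_data):
    # Update the source of truth first, then invalidate the cached copy.
    # The two steps are not a single transaction: a reader that hits the
    # database between them can repopulate the cache with stale data.
    write_to_database(user_id, new_data)
    cache.pop(user_id, None)  # invalidate rather than update in place
```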
- Mitigating failures: A single cache server represents a potential Single Point of Failure (SPOF).
  - A Single Point of Failure is a part of the system that, if it fails, will stop the entire system from working.
  - As a result, multiple cache servers across different data centers are recommended to avoid a SPOF. Another approach is to over-provision the required memory by a certain percentage, which provides a buffer as memory usage increases.
- Eviction policy: Once the cache is full, any request to add new items might cause existing items to be removed. This is called cache eviction. Least Recently Used (LRU) is the most popular cache eviction policy. Other eviction policies include:
  - Least Frequently Used (LFU)
  - First In, First Out (FIFO)

Any of these can be adopted according to the application's use cases. A minimal LRU sketch is shown below.
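As an illustration of LRU in particular, here is a small self-contained implementation built on Python's `OrderedDict`, which tracks recency via insertion order. The class name and capacity are illustrative choices, not a specific library's API.

```python
from collections import OrderedDict


class LRUCache:
    """Minimal LRU cache: evicts the least recently used key when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()  # insertion order tracks recency

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict least recently used


# Usage: with capacity 2, adding a third key evicts the least recently used.
lru = LRUCache(2)
lru.put("a", 1)
lru.put("b", 2)
lru.get("a")      # "a" is now most recently used
lru.put("c", 3)   # evicts "b"
assert lru.get("b") is None
```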
Do you have any questions, or simply wish to contact me privately? Don't hesitate to shoot me a DM on Twitter.
Have a wonderful day.
Abhishek