Distributed Cache
A distributed cache stores frequently accessed data in memory across multiple servers, enabling sub-millisecond data retrieval. In system design interviews, caching appears in nearly every architecture—it's how you achieve the performance that users expect from modern applications.
This page covers what you need to know for interviews: why caching matters, cache writing and eviction policies, how to design a distributed cache from scratch, and the trade-offs between popular solutions like Redis and Memcached. Understanding these concepts helps you design systems that handle millions of requests without overwhelming your database.
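To make the "without overwhelming your database" idea concrete, here is a minimal in-process sketch of the cache-aside read pattern that underlies most distributed caches. All names here (`CacheAside`, `fetch_user`) are illustrative, not from any particular library; a real distributed cache would store entries on remote servers rather than in a local dict.

```python
import time

class CacheAside:
    """Toy sketch of the cache-aside pattern: reads check the cache
    first and fall back to the database on a miss, populating the
    cache so later reads skip the database. Entries expire after a TTL."""

    def __init__(self, load_from_db, ttl_seconds=60):
        self.load_from_db = load_from_db   # stands in for a real database query
        self.ttl = ttl_seconds
        self.store = {}                    # key -> (value, expiry timestamp)

    def get(self, key):
        entry = self.store.get(key)
        if entry is not None:
            value, expires_at = entry
            if time.monotonic() < expires_at:
                return value               # cache hit: no database round trip
        value = self.load_from_db(key)     # cache miss: query the database once
        self.store[key] = (value, time.monotonic() + self.ttl)
        return value

# Usage: wrap an expensive lookup so repeated reads avoid the database.
db_calls = []

def fetch_user(key):
    db_calls.append(key)                   # record each "database" access
    return {"id": key, "name": f"user-{key}"}

cache = CacheAside(fetch_user, ttl_seconds=60)
first = cache.get(1)                       # miss: goes to the database
second = cache.get(1)                      # hit: served from memory
```

The point of the sketch is the ratio: ten thousand reads of a hot key cost one database query plus ten thousand in-memory lookups, which is how caches absorb traffic that would otherwise crush the database.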