> [!NOTE]
> Database reads are famously slow. Keeping frequently accessed data in fast memory (RAM) is the single most effective way to drop latency from roughly 200 ms down to 2 ms. This is why every tech company at scale uses caching: it's not optional, it's survival.
## What is a Cache?
A cache exploits the locality of reference principle: data that was accessed recently is likely to be accessed again soon. Instead of executing a complex 15-table SQL JOIN every time a user logs in, you compute that data once and store it as a simple key-value pair in an in-memory datastore like Redis or Memcached.
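The compute-once-then-store pattern above can be sketched in a few lines of Python. This is a minimal illustration, not production code: a plain dict stands in for Redis/Memcached, and the function and key names (`fetch_user_profile`, `user:<id>:profile`) are hypothetical, not from the article.

```python
# Stand-in for Redis/Memcached: a plain dict keyed by string.
# In production you would use a real client (e.g. redis-py).
cache = {}

def expensive_db_query(user_id):
    """Simulates the costly multi-table JOIN described above."""
    return {"id": user_id, "name": f"user-{user_id}"}

def fetch_user_profile(user_id):
    key = f"user:{user_id}:profile"
    if key in cache:                     # cache hit: skip the database
        return cache[key]
    value = expensive_db_query(user_id)  # cache miss: compute once...
    cache[key] = value                   # ...then store for next time
    return value

first = fetch_user_profile(42)   # miss: runs the "database" query
second = fetch_user_profile(42)  # hit: served straight from memory
```

The second call returns the stored value without touching the database, which is exactly where the latency win comes from.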
Because RAM access (~100 nanoseconds) is roughly 1,000x faster than an SSD read (~100 microseconds) and 100,000x faster than a spinning disk seek (~10 milliseconds), the speedup is dramatic and immediate.
## Real-World Impact: Facebook
Facebook runs the world's largest Memcached deployment—over 5,000 cache servers handling billions of requests per second. Without caching, every Facebook feed load would require dozens of database queries across their social graph. With caching, 99% of reads never touch the database at all. Facebook's engineering team has said that removing their cache layer would require 10x more database servers than they currently run.
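It's worth seeing what a 99% hit rate does to average latency. The arithmetic below is purely illustrative, reusing the 2 ms (cache) and 200 ms (database) figures from the introduction rather than any real Facebook numbers:

```python
# Expected read latency = hit_rate * cache_latency + miss_rate * db_latency
hit_rate = 0.99
cache_ms = 2.0    # in-memory read, from the intro
db_ms = 200.0     # database read, from the intro

avg_latency_ms = hit_rate * cache_ms + (1 - hit_rate) * db_ms
# 0.99 * 2 + 0.01 * 200 = 1.98 + 2.00 = 3.98 ms on average
```

Even though 1% of reads still pay the full 200 ms, the average drops from 200 ms to under 4 ms—a ~50x improvement from the hit rate alone.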
## Real-World Impact: Twitter
Twitter uses Redis extensively to cache user timelines. When you open Twitter, you're not querying a database: you're reading a pre-computed list of tweet IDs from Redis. This is why Twitter can serve your home timeline in under 50 milliseconds despite having hundreds of millions of active users.
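A pre-computed timeline like this is just a capped list of IDs, newest first. The sketch below approximates that idea with a `deque(maxlen=...)` standing in for a Redis list maintained via LPUSH/LTRIM; the names (`push_tweet`, `read_timeline`, the 800-entry cap) are assumptions for illustration, not Twitter's actual design.

```python
from collections import deque

TIMELINE_LENGTH = 800  # hypothetical cap; old IDs fall off the end

timelines = {}  # user_id -> deque of tweet IDs, newest first

def push_tweet(user_id, tweet_id):
    """Fan a new tweet ID out to a follower's pre-computed timeline."""
    tl = timelines.setdefault(user_id, deque(maxlen=TIMELINE_LENGTH))
    tl.appendleft(tweet_id)  # newest first, like Redis LPUSH

def read_timeline(user_id, count=20):
    """Serving a page is just slicing the stored ID list—no DB query."""
    return list(timelines.get(user_id, []))[:count]

for tid in range(1, 6):
    push_tweet("alice", tid)

page = read_timeline("alice", count=3)  # the three newest tweet IDs
```

Reads never touch the database; the expensive work (fanning out IDs on write) happens once per tweet instead of once per timeline load.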
## Caching Policies
When building your application, you must decide exactly how the App, Cache, and Database interact. The wrong choice can lead to stale data, cache stampedes, or wasted resources.
- Cache Aside (Lazy L…