System Design: The Complete Guide

4. Caching Strategies

Speeding up reads using RAM layers.

Feb 22, 2026

[!NOTE]
Database reads are famously slow. Writing frequently accessed data into fast memory (RAM) is the single most effective way to drop latency from 200ms down to 2ms. This is why every tech company at scale uses caching—it's not optional, it's survival.

What is a Cache?

A cache exploits the locality of reference principle: data that was accessed recently is likely to be accessed again soon. Instead of executing a complex 15-table SQL JOIN query every time a user logs in, you compute that data once and store it as a simple key-value pair inside an in-memory datastore like Redis or Memcached.

Because RAM access (~100 nanoseconds) is roughly 1,000x faster than an SSD read (~100 microseconds) and 100,000x faster than a spinning disk seek (~10 milliseconds), the speedup is dramatic and immediate.
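The quoted speedup factors follow directly from the latency numbers, which is easy to check:

```python
# Approximate access latencies from the text, expressed in nanoseconds.
RAM_NS = 100            # ~100 ns RAM access
SSD_NS = 100_000        # ~100 µs SSD read
DISK_NS = 10_000_000    # ~10 ms spinning-disk seek

print(SSD_NS // RAM_NS)   # SSD is ~1,000x slower than RAM
print(DISK_NS // RAM_NS)  # disk is ~100,000x slower than RAM
```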

Real-World Impact: Facebook

Facebook runs the world's largest Memcached deployment—over 5,000 cache servers handling billions of requests per second. Without caching, every Facebook feed load would require dozens of database queries across their social graph. With caching, 99% of reads never touch the database at all. Facebook's engineering team has said that removing their cache layer would require 10x more database servers than they currently run.

Real-World Impact: Twitter

Twitter uses Redis extensively to cache user timelines. When you open Twitter, you're not querying a database—you're reading a pre-computed list of tweet IDs from Redis. This is why Twitter can serve your home timeline in under 50 milliseconds despite having hundreds of millions of active users.
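A pre-computed timeline of this kind can be sketched as a capped per-user list that is pushed to on write and read verbatim. This is a simplified illustration, not Twitter's actual code: a dict of `deque`s stands in for Redis lists (where `LPUSH`/`LRANGE` would play the same roles), and the key scheme and cap size are assumptions.

```python
from collections import defaultdict, deque

TIMELINE_CAP = 800  # keep only the newest N tweet IDs per user (assumed cap)

# One capped list of tweet IDs per user; newest entries at the front.
timelines: dict[int, deque] = defaultdict(lambda: deque(maxlen=TIMELINE_CAP))

def fan_out(tweet_id: int, follower_ids: list[int]) -> None:
    # On write: push the new tweet's ID onto every follower's cached timeline.
    for follower in follower_ids:
        timelines[follower].appendleft(tweet_id)

def home_timeline(user_id: int, count: int = 20) -> list[int]:
    # On read: return the pre-computed IDs directly; no database query.
    return list(timelines[user_id])[:count]
```

The expensive work (deciding which tweets belong in whose feed) happens once at write time, so the read path is just a list lookup.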

Caching Policies

When building your application, you must decide exactly how the App, Cache, and Database interact. The wrong choice can lead to stale data, cache stampedes, or wasted resources.

  1. Cache Aside (Lazy L…
