Cache Me If You Can: A Practical Guide to Web Caching
Caching is one of the most powerful — and misunderstood — tools in a web developer's toolkit. It's often tossed around like a buzzword: "just cache it," they say. But what does that really mean? Where does the cache live? For how long? And why do so many engineers avoid it altogether?
In this post, I'll walk through caching from the ground up: from memoizing functions in JavaScript to edge caching with CDNs and Redis. If you've ever been confused by caching, or just want to finally get it right, this is for you.
1. In-Memory Caching (a.k.a. Memoization)
Let's start small. Say we're writing a recursive Fibonacci function:
```typescript
function fib(n: number): number {
  if (n <= 1) return n;
  return fib(n - 1) + fib(n - 2);
}
```
This is inefficient. Every time you call fib(40), it recomputes the same subproblems over and over.
Memoization fixes that:
```typescript
const memo: Record<number, number> = {};

function fib(n: number): number {
  if (n in memo) return memo[n];
  if (n <= 1) return n;
  memo[n] = fib(n - 1) + fib(n - 2);
  return memo[n];
}
```
Use this for CPU-heavy operations or data that doesn't change often and doesn't need to persist between requests.
2. Client-Side Caching (Browser)
On the frontend, we cache data to reduce API calls and improve perceived performance. This can happen in memory (e.g. React Query or Zustand) or in persistent storage like localStorage or IndexedDB.
Example: Cache API response in localStorage
```typescript
async function getUserData() {
  const cached = localStorage.getItem('user');
  if (cached) return JSON.parse(cached);

  const res = await fetch('/api/user');
  const data = await res.json();
  localStorage.setItem('user', JSON.stringify(data));
  return data;
}
```
Great for non-sensitive, infrequently changing data like settings, preferences, or static lookups.
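One caveat: the example above caches forever, so a stale user record never refreshes. Here's a minimal sketch of the same idea with a TTL, written against a StorageLike interface (a stand-in matching the Web Storage API shape) so it can run outside the browser; in real frontend code you'd pass localStorage directly:

```typescript
// Anything with the Web Storage API shape (localStorage, sessionStorage, a stub).
interface StorageLike {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
  removeItem(key: string): void;
}

// Store the value alongside an absolute expiry timestamp.
function setWithTTL(storage: StorageLike, key: string, value: unknown, ttlMs: number): void {
  storage.setItem(key, JSON.stringify({ value, expiresAt: Date.now() + ttlMs }));
}

// Entries past their TTL are evicted and treated as cache misses.
function getWithTTL<T>(storage: StorageLike, key: string): T | null {
  const raw = storage.getItem(key);
  if (!raw) return null;
  const { value, expiresAt } = JSON.parse(raw) as { value: T; expiresAt: number };
  if (Date.now() > expiresAt) {
    storage.removeItem(key); // expired: evict and miss
    return null;
  }
  return value;
}
```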
3. Server-Side Caching
"Server caching" can mean several things:
- In-process memory (e.g., an LRU cache)
- Redis/Memcached
- File system cache
- Framework or route-level caching
Example: LRU cache in Node.js
```typescript
import LRU from 'lru-cache';

const cache = new LRU({ max: 100 });

async function getPost(id: string) {
  if (cache.has(id)) return cache.get(id);

  const post = await db.getPost(id);
  cache.set(id, post);
  return post;
}
```
Best used for read-heavy endpoints, product data, or any expensive computation with limited variation.
4. CDN and Edge Caching
This is where caching really scales. Use CDNs like Cloudflare, Fastly, or Akamai to cache static files or even full API responses closer to the user.
In frameworks like Next.js, this is easy to configure:
```typescript
export const revalidate = 60;

export async function GET() {
  const res = await fetch('https://api.example.com/data');
  const data = await res.json();
  return Response.json(data);
}
```
Or with manual cache headers:
```typescript
return new Response(JSON.stringify(data), {
  headers: {
    'Cache-Control': 'public, max-age=60, stale-while-revalidate=30',
  },
});
```
Ideal for static or semi-static content. Especially powerful when used at the edge.
5. Redis and External Caches
When your app runs across multiple servers or needs low-latency, shared caching, tools like Redis are perfect.
Example: Cache DB results in Redis
```typescript
import { createClient } from 'redis';

const redis = createClient();
await redis.connect();

async function getProduct(id: string) {
  const cached = await redis.get(`product:${id}`);
  if (cached) return JSON.parse(cached);

  const product = await db.getProductById(id);
  await redis.set(`product:${id}`, JSON.stringify(product), { EX: 3600 });
  return product;
}
```
Common for API responses, feed data, user sessions, and anything you don't want to hit the database for repeatedly.
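One thing the Redis example above leaves open is invalidation: with an hour-long TTL, a price change can serve stale reads for up to an hour. A common fix is the cache-aside pattern with explicit invalidation on writes. Here's a sketch; AsyncCache and db are illustrative stand-ins (a real Redis client's get/set/del fits the interface), not a specific library API:

```typescript
// A stand-in for any shared async cache (e.g. a Redis client's get/set/del).
interface AsyncCache {
  get(key: string): Promise<string | null>;
  set(key: string, value: string): Promise<void>;
  del(key: string): Promise<void>;
}

// Cache-aside with explicit invalidation: reads go through the cache;
// writes update the source of truth, then delete the stale cache entry.
function makeProductStore(
  cache: AsyncCache,
  db: {
    read(id: string): Promise<{ id: string; price: number }>;
    write(id: string, price: number): Promise<void>;
  }
) {
  return {
    async get(id: string) {
      const hit = await cache.get(`product:${id}`);
      if (hit) return JSON.parse(hit);
      const product = await db.read(id);
      await cache.set(`product:${id}`, JSON.stringify(product));
      return product;
    },
    async updatePrice(id: string, price: number) {
      await db.write(id, price);        // 1. update the source of truth
      await cache.del(`product:${id}`); // 2. invalidate; the next read refills
    },
  };
}
```

Deleting on write (rather than updating the cached value in place) keeps the write path simple and sidesteps most race conditions between concurrent writers.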
6. Database Query Caching
Some databases and ORMs allow query-level caching, materialized views, or extensions.
Even if your ORM (e.g., Prisma) doesn't support query caching natively, you can combine it with Redis or use a custom cache layer.
Tools and strategies:
- Materialized views (Postgres)
- Caching API layer (Hasura, GraphQL persisted queries)
- Redis-backed custom query caches
Database caching is ideal for expensive JOIN-heavy queries, analytics dashboards, or anything you know won't change frequently.
Summary: Think in Layers
Here's a quick reference for the different caching layers:

| Layer | Scope | Tools/Examples |
| --- | --- | --- |
| In-memory | Function-level | Memoization, LRU cache |
| Client-side | Per-user/session | localStorage, SWR, React Query |
| Server-side | Per-request or shared | Redis, LRU, memory store |
| Edge/CDN | Global | Cloudflare, Fastly, Cache-Control |
| Database-level | Query-specific | Redis, materialized views, Hasura |
Final Thoughts
Caching is all about trade-offs: speed vs freshness, simplicity vs correctness. The key is to cache as close to the user as you can, only as long as necessary, and to know when to invalidate.
Start with the slowest or most expensive operation in your app, and ask: "Can I cache this safely?"
That's where the real power of caching begins.