Introduction: The Final Piece of the Puzzle
We’ve spent the last nine posts in this series unraveling the layers of caching in Next.js—from static page generation to API response caching, image optimization, and CDN strategies. Each piece was essential for squeezing the most performance out of your application.
But there’s one more layer left to explore: server-side caching versus client-side caching.
This is where many developers get tripped up—because it’s not a question of either/or. It’s about understanding when and where each type of caching shines, and how you can combine them to deliver a fast, seamless user experience without sacrificing flexibility.
In this final post, we’re going to zoom out and look at the big picture:
- What exactly is the difference between server-side and client-side caching?
- When should you reach for Redis or a CDN, and when is SWR or React Query the better choice?
- How can you combine both to build a robust, scalable Next.js app that feels fast for everyone, everywhere?
Let’s close this series with a practical guide you’ll actually use in the wild.
1. The Core Difference: Server-Side vs. Client-Side Caching
When we talk about caching in web development, it’s easy to lump everything under one big umbrella. But in reality, server-side caching and client-side caching solve very different problems—and knowing when to use each is the difference between a performant app and a sluggish one.
Let’s break it down.
Server-Side Caching: The Backend Booster
This happens before the response ever leaves the server. It’s about reducing the load on your backend—whether it’s your database, API, or any computationally heavy logic.
Examples:
- Redis: Store pre-computed API responses, like product lists, so you don’t hit your database on every request.
- CDNs: Cache full HTML pages, static assets, or API responses at the edge for ultra-fast delivery.
- SSR Page Caching: Cache the output of `getServerSideProps` so you don’t regenerate it on every request.
Ideal for:
- Shared data across users (think blog posts, product catalogs)
- Expensive-to-generate content (heavy DB queries, data aggregation)
- APIs that don’t change frequently (e.g., site settings, categories)
Client-Side Caching: The User Experience Glue
This happens in the browser. It’s about keeping data fresh and interactions fast once the page is loaded.
Examples:
- SWR / React Query: Store API results in-memory for instant re-rendering across components.
- Browser Cache: Static assets like images and scripts can be cached locally via Cache-Control headers (see the sketch after these lists).
Ideal for:
- User-specific data (profile info, cart, wishlist)
- Data that needs frequent updates per user (notifications, dashboard stats)
- Interactive UI elements that don’t require a server roundtrip every time
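To illustrate the Browser Cache bullet above, here’s a minimal sketch of custom Cache-Control headers in next.config.js. The /fonts path and the one-year max-age are just assumptions for the example; Next.js already sets long-lived headers for its own hashed assets under _next/static.

```js
// next.config.js: a minimal sketch (the /fonts path and max-age are assumptions)
module.exports = {
  async headers() {
    return [
      {
        // Long-lived browser caching for versioned, immutable assets
        source: "/fonts/:path*",
        headers: [
          {
            key: "Cache-Control",
            value: "public, max-age=31536000, immutable",
          },
        ],
      },
    ];
  },
};
```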
Bottom line:
- Server-side caching reduces server load and speeds up response times for everyone.
- Client-side caching makes your app feel snappy and responsive for individual users.
Both are critical—but they serve very different purposes. Let’s dive deeper into when to use each.
2. When to Use Server-Side Caching
If you’ve ever watched your server logs light up like a Christmas tree every time someone hits the homepage or browses products, you know the pain of uncached data. That’s where server-side caching shines. It’s about caching once for many users—taking the load off your backend and speeding up responses for everyone.
Here’s when server-side caching is the right call:
Shared Data Across Users
Think of product catalogs, blog posts, category pages—data that’s the same for all users. It doesn’t make sense to regenerate this for every request. Cache it at the server level (Redis) or even better, push it to a CDN and let the edge do the work.
Heavy Database Queries
Imagine you’re running an e-commerce site with thousands of products, filtering options, and categories. Every time someone hits your `/api/products` endpoint, you’re querying the DB, sorting, filtering, paginating... It adds up.
Cache the result in Redis with a sensible TTL (say, 60 seconds). You’ve just shaved precious milliseconds off every request.
Here’s a quick Redis API caching example:
```js
// /pages/api/products.js
import redis from "@/lib/redis";
import { getProducts } from "@/lib/db";

export default async function handler(req, res) {
  const cached = await redis.get("products");

  if (cached) {
    console.log("Serving from cache");
    return res.json(JSON.parse(cached));
  }

  const products = await getProducts(); // DB call
  await redis.set("products", JSON.stringify(products), "EX", 60); // 60 sec cache

  console.log("Serving from DB");
  res.json(products);
}
```
Full-Page SSR Caching
If you’re using `getServerSideProps` for pages like blog posts or landing pages, consider caching the entire HTML output. With Vercel Middleware or CDN rules, you can cache those pages at the edge so they don’t have to be regenerated every time.
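On Vercel, or any CDN that respects s-maxage, one simple way to do this is to set a Cache-Control header from inside getServerSideProps itself. Here’s a rough sketch; the 60-second window, the stale-while-revalidate value, and the getPostBySlug helper are assumptions for the example:

```js
// /pages/blog/[slug].js: sketch of edge caching for an SSR page
export async function getServerSideProps({ res, params }) {
  // Let the CDN/edge cache the HTML for 60s, then serve stale while revalidating
  res.setHeader(
    "Cache-Control",
    "public, s-maxage=60, stale-while-revalidate=300"
  );

  const post = await getPostBySlug(params.slug); // hypothetical data helper
  return { props: { post } };
}
```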
The goal of server-side caching is simple:
Cache shared data, offload heavy work, and make the server breathe easy.
3. When to Use Client-Side Caching
Alright—so we’ve covered how server-side caching handles shared data and takes the load off your backend. But what about user-specific data—the stuff that’s dynamic, per-user, and often needs to feel instant? That’s where client-side caching steps in.
Think of client-side caching as a performance enhancer for the user experience. It’s about keeping the browser “smart” and avoiding unnecessary network requests whenever possible.
Here’s when you absolutely should use it:
User-Specific Data
Data that’s unique to the user—like a profile, wishlist, or cart—doesn’t belong in a shared server cache. Why? Because it’s personal.
Instead, let the client manage it. Use libraries like SWR or React Query to fetch and cache this data in the browser memory.
Example: Fetching the logged-in user’s profile info using SWR:
```js
import useSWR from "swr";

const fetcher = (url) => fetch(url).then((res) => res.json());

export default function Profile() {
  const { data, error } = useSWR("/api/user", fetcher);

  if (error) return <div>Failed to load</div>;
  if (!data) return <div>Loading...</div>;

  return <div>Hello, {data.name}!</div>;
}
```
Now, every time the user navigates to a page that uses this data, it’s already there—no refetching needed.
Frequent, Non-Critical Data
Not every piece of data needs to be fresh on every request. For instance:
- Notifications
- Activity feeds
- Dashboard summaries
With SWR or React Query, you can configure stale-while-revalidate behavior:
- Show cached data instantly
- Fetch the latest data in the background
- Update the UI once it’s ready
This is the “optimistic” UX users love—it feels fast even when the data is still loading behind the scenes.
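With SWR, most of this comes out of the box, and you can tune it with options like refreshInterval and revalidateOnFocus. Here’s a minimal sketch; the /api/notifications endpoint, the unreadCount field, and the 30-second interval are assumptions for the example:

```js
import useSWR from "swr";

const fetcher = (url) => fetch(url).then((res) => res.json());

export default function Notifications() {
  // Cached data renders instantly; SWR refetches in the background and on tab focus
  const { data } = useSWR("/api/notifications", fetcher, {
    refreshInterval: 30000,  // poll every 30 seconds (assumed value)
    revalidateOnFocus: true, // refresh when the user returns to the tab
  });

  if (!data) return <div>Loading...</div>;
  return <div>You have {data.unreadCount} unread notifications</div>;
}
```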
UI State That Doesn’t Belong in Global State
Need to persist form data, filters, or scroll positions? That’s a perfect use case for client-side caching. Keep it in memory—there’s no need to involve the server.
Bottom line:
- Client-side caching is for personal, fast-changing data.
- It shines when you want instant UI updates without hammering your API on every page load.
4. Combining Server and Client Caching for Best Results
Here’s the thing: server-side caching and client-side caching aren’t competitors—they’re teammates. The best-performing Next.js apps use both, playing to their strengths.
Let’s talk real-world architecture:
A Hybrid Caching Strategy That Works
Server-Side Caching for Shared Data
Use Redis (or a CDN) to cache API responses that are the same for everyone—product lists, categories, static settings. That way, you avoid expensive database hits for data that changes infrequently.
Client-Side Caching for Personalized Data
Use SWR or React Query for per-user data—like carts, user profiles, wishlists, or anything that changes based on who’s logged in. Let the browser handle those requests, and revalidate only when needed.
A Real Example: E-commerce
Let’s say you have an online store:
- Product listings? Cached in Redis (shared across all users).
- Individual product details? Cached per product in Redis for fast API responses (see the sketch below).
- User’s cart? Client-side cache with SWR or React Query.
- User profile, orders, or preferences? Client-side cache.
This way, your server stays lean, and the user experience feels lightning-fast.
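For the per-product case, the Redis pattern from section 2 works the same way; you just key the cache by product ID. A rough sketch, assuming the same Redis client and a hypothetical getProductById helper:

```js
// /pages/api/products/[id].js: sketch of a per-product cache key
import redis from "@/lib/redis";
import { getProductById } from "@/lib/db"; // hypothetical helper

export default async function handler(req, res) {
  const { id } = req.query;
  const cacheKey = `product:${id}`;

  const cached = await redis.get(cacheKey);
  if (cached) return res.json(JSON.parse(cached));

  const product = await getProductById(id);
  await redis.set(cacheKey, JSON.stringify(product), "EX", 300); // 5 min TTL (assumed)
  res.json(product);
}
```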
SSR + SWR: The Dream Team
For pages rendered with `getServerSideProps`, you can also layer client-side caching on top:
- Cache the SSR HTML at the edge or with Redis.
- On the client, use SWR to keep the data fresh in the background.
It’s the best of both worlds:
- Fast initial load from SSR
- Fresh, dynamic updates from SWR
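Here’s a minimal sketch of that pattern using SWR’s fallbackData option. It reuses the getProducts helper and /api/products endpoint from earlier, and assumes products have id and name fields:

```js
// /pages/products.js: SSR for the first paint, SWR keeps it fresh afterwards
import useSWR from "swr";
import { getProducts } from "@/lib/db";

const fetcher = (url) => fetch(url).then((res) => res.json());

export async function getServerSideProps() {
  const products = await getProducts();
  return { props: { fallback: products } };
}

export default function Products({ fallback }) {
  // The SSR data renders immediately; SWR revalidates it in the background
  const { data } = useSWR("/api/products", fetcher, { fallbackData: fallback });

  return (
    <ul>
      {data.map((p) => (
        <li key={p.id}>{p.name}</li>
      ))}
    </ul>
  );
}
```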
Why This Matters
Without a hybrid strategy, you risk one of two things:
- Overloading your server (if you skip server-side caching)
- Making your app feel sluggish and unresponsive (if you skip client-side caching)
By combining both, you get the performance boost of shared caching plus the responsiveness of personalized data.
5. Best Practices Recap
Let’s keep this tight. If you remember nothing else, remember these core principles:
For Server-Side Caching:
- Always set a time-to-live (TTL) for your cached data. Stale cache can cause more harm than good.
- Cache shared data only—things like product catalogs, blog posts, or API responses that are the same for all users.
- Use Redis, CDN edge caching, or middleware solutions for best results.
- Monitor your cache hit rates. If your cache isn’t being hit often, you’re not getting the performance gains you expect.
For Client-Side Caching:
- Use client-side libraries like SWR or React Query to handle user-specific data—carts, profiles, dashboards.
- Leverage stale-while-revalidate patterns to keep the UI fast while refreshing data in the background.
- Don’t cache everything blindly—only cache what actually benefits from it.
- Watch your cache scopes. Keep them small and relevant to specific components or pages.
For Both:
- Don’t rely on one type of caching alone.
- Combine server-side and client-side caching for a layered, resilient system.
- Review and adjust caching strategies as your app grows. What works for 1,000 users may not hold up at 100,000.
Caching is a powerful tool, but like any tool, it’s only as good as how you use it.
Conclusion: Cache Smart, Not Hard
Caching is not about guesswork or shortcuts—it’s about knowing your data, understanding your app’s behavior, and being intentional about where and how you store things.
Here’s the bottom line:
- Server-side caching (Redis, CDNs) is your weapon for shared data and heavy-lifting backend tasks.
- Client-side caching (SWR, React Query) is the key to snappy, user-focused experiences in the browser.
- When used together, they create a seamless, resilient system that performs under pressure and scales with your app.
This series has been all about building that foundation—understanding caching from static pages to API responses, image optimization, edge strategies, and now, combining server and client-side caching in a way that just works.
If you’ve followed along, you now have a caching strategy that’s practical, battle-tested, and built for the real world.
Thanks for reading, and remember: performance is a team effort—code, infrastructure, and strategy all matter.
This Concludes Our Caching Series
Thanks for joining me on this deep dive into caching in Next.js. I hope you’ve found it insightful, practical, and worth the time.
Check out the full series below to revisit any topics you’d like to explore again. Let’s stay connected—drop me a message or connect on LinkedIn.