<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Ivan Alekseev</title>
    <description>The latest articles on Forem by Ivan Alekseev (@xvandev).</description>
    <link>https://forem.com/xvandev</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1723149%2F7bdb68e5-8081-44bb-97f7-32f89f73fe0e.jpg</url>
      <title>Forem: Ivan Alekseev</title>
      <link>https://forem.com/xvandev</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/xvandev"/>
    <language>en</language>
    <item>
      <title>Deploy NextJs and NestJs as a single application</title>
      <dc:creator>Ivan Alekseev</dc:creator>
      <pubDate>Mon, 09 Sep 2024 09:07:15 +0000</pubDate>
      <link>https://forem.com/xvandev/deploy-nextjs-and-nestjs-as-a-single-application-15mj</link>
      <guid>https://forem.com/xvandev/deploy-nextjs-and-nestjs-as-a-single-application-15mj</guid>
      <description>&lt;p&gt;Hey there! I’m excited to share how you can configure NestJS to work seamlessly on a single host. But first, let me explain why this setup has been my top choice for managing both frontend and backend for so long.&lt;/p&gt;

&lt;p&gt;Next.js is a powerhouse when it comes to kickstarting new projects. It comes packed with features like built-in routing, server-side rendering (SSR), and caching that help you hit the ground running. Plus, Next.js has its own internal API capabilities, letting you manage tasks like caching and data prep right within the framework. This means you can focus more on building your app and less on setting up the infrastructure.&lt;/p&gt;

&lt;p&gt;But sometimes you need something more powerful on the server. That’s where Nest.js steps in. This framework is powerful enough to handle not just the middleware duties between your backend and frontend, but also to act as a robust backend solution on its own. That makes NestJS a natural companion to Next.js, letting you use a single programming language for both the frontend and the backend.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why a single host?
&lt;/h2&gt;

&lt;p&gt;Simply put, it’s incredibly convenient. With just a git pull and a docker-compose up -d, you’re ready to go. There is no need to worry about CORS or to juggle ports. Plus, it streamlines the delivery process, making everything run more smoothly and efficiently. The main drawback is that this setup does not suit large projects under heavy load.&lt;/p&gt;

&lt;h2&gt;
  
  
  1. First, let's define the folder structure of your repository
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frcfvb7kahhij99eqnsgk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frcfvb7kahhij99eqnsgk.png" alt="Image description" width="800" height="416"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  2. Let's declare a Docker Compose file for the server
&lt;/h2&gt;

&lt;p&gt;File: ./docker-compose.yml&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;services:
    nginx:
        image: nginx:alpine
        ports:
            - "80:80"
        volumes:
            - "./docker/nginx/conf.d:/etc/nginx/conf.d"
        depends_on:
            - frontend
            - backend
        networks:
            - internal-network
            - external-network

    frontend:
        image: ${FRONTEND_IMAGE}
        restart: always
        networks:
            - internal-network

    backend:
        image: ${BACKEND_IMAGE}
        environment:
            NODE_ENV: ${NODE_ENV}
            POSTGRES_HOST: ${POSTGRES_HOST}
            POSTGRES_USER: ${POSTGRES_USER}
            POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
            POSTGRES_DB: ${POSTGRES_DB}
        depends_on:
            - postgres
        restart: always
        networks:
            - internal-network

    postgres:
        image: postgres:12.1-alpine
        container_name: postgres
        volumes:
            - "./docker/postgres:/var/lib/postgresql/data"
        environment:
            POSTGRES_USER: ${POSTGRES_USER}
            POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
            POSTGRES_DB: ${POSTGRES_DB}
        ports:
            - "5432:5432"
        networks:
            - internal-network

networks:
    internal-network:
        driver: bridge

    external-network:
        driver: bridge
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;




&lt;h2&gt;
  
  
  3. A Docker Compose file for development mode
&lt;/h2&gt;

&lt;p&gt;For development mode, we don’t need containers for the backend and frontend because we will run them locally.&lt;/p&gt;

&lt;p&gt;File: ./docker-compose.dev.yml&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: '3'

services:
    nginx:
        image: nginx:alpine
        ports:
            - "80:80"
        volumes:
            - "./docker/nginx/conf.d:/etc/nginx/conf.d"
        extra_hosts:
            # lets the nginx container reach services running on the host (needed on Linux)
            - "host.docker.internal:host-gateway"

    postgres:
        image: postgres:12.1-alpine
        container_name: postgres
        volumes:
            - "./docker/postgres:/var/lib/postgresql/data"
        environment:
            POSTGRES_USER: postgres
            POSTGRES_PASSWORD: postgres
            POSTGRES_DB: postgres
        ports:
            - "5432:5432"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  4. Dockerfile for the backend
&lt;/h2&gt;

&lt;p&gt;File: ./backend/Dockerfile&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;FROM node:18-alpine AS deps
RUN apk add --no-cache libc6-compat
WORKDIR /app

COPY package.json package-lock.json ./
RUN  npm install

FROM node:18-alpine AS builder
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY . .

RUN npm run build

FROM node:18-alpine AS runner
WORKDIR /app

ENV NODE_ENV production

RUN addgroup --system --gid 1001 nodejs
RUN adduser --system --uid 1001 nextjs

COPY --from=builder --chown=nextjs:nodejs /app/dist ./dist
COPY --from=builder /app/node_modules ./node_modules
COPY --from=builder /app/package.json ./package.json

RUN mkdir -p /app/backups &amp;amp;&amp;amp; chown -R nextjs:nodejs /app/backups &amp;amp;&amp;amp; chmod -R 777 /app/backups

USER nextjs

EXPOSE 3010

ENV PORT 3010

CMD ["node", "dist/src/main"]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h2&gt;
  
  
  5. Dockerfile for the frontend
&lt;/h2&gt;

&lt;p&gt;File: ./frontend/Dockerfile&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;FROM node:18-alpine AS deps
RUN apk add --no-cache libc6-compat
WORKDIR /app

COPY package.json package-lock.json ./
RUN  npm install

FROM node:18-alpine AS builder
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY . .

ENV NEXT_TELEMETRY_DISABLED 1

RUN npm run build

FROM node:18-alpine AS runner
WORKDIR /app

ENV NODE_ENV production
ENV NEXT_TELEMETRY_DISABLED 1

RUN addgroup --system --gid 1001 nodejs
RUN adduser --system --uid 1001 nextjs

COPY --from=builder --chown=nextjs:nodejs /app/.next ./.next
COPY --from=builder --chown=nextjs:nodejs /app/public ./public
COPY --from=builder /app/node_modules ./node_modules
COPY --from=builder /app/package.json ./package.json

USER nextjs

EXPOSE 3000

ENV PORT 3000

CMD ["npm", "start"]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  6. Nginx configuration
&lt;/h2&gt;

&lt;p&gt;In this step, we configure Nginx to act as a reverse proxy for our Next.js frontend and Nest.js backend. The Nginx configuration allows you to route requests seamlessly between the frontend and backend, all while serving them from the same host.&lt;/p&gt;

&lt;p&gt;File: /docker/nginx/conf.d/default.conf&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;server {
    listen 80;

    location / {
        proxy_pass http://host.docker.internal:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    location /api {
        proxy_pass http://host.docker.internal:3010;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This configuration listens on port 80 and routes general traffic to the Next.js frontend on port 3000, while any requests to /api are forwarded to the Nest.js backend on port 3010. Note that host.docker.internal resolves to the host machine, which fits the development setup where the frontend and backend run locally; in the fully containerized production setup, the services can be addressed by their Compose service names instead.&lt;/p&gt;
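&lt;p&gt;For reference, in the fully containerized setup from docker-compose.yml above, the upstream addresses would use the Compose service names rather than host.docker.internal (a sketch, assuming the frontend and backend service names from that file):&lt;/p&gt;

```nginx
# Production variant: nginx shares the internal-network with the
# containers, so Compose service names resolve directly.
location / {
    proxy_pass http://frontend:3000;
}

location /api {
    proxy_pass http://backend:3010;
}
```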

&lt;h2&gt;
  
  
  7. NestJs global prefix
&lt;/h2&gt;

&lt;p&gt;Since we use the same host, we need NestJs to be available under the /api path. To do this, we call setGlobalPrefix('api').&lt;/p&gt;

&lt;p&gt;File: ./backend/src/main.js&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule, { cors: true  });
  app.setGlobalPrefix('api');
  await app.listen(3010);
}
bootstrap();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  8. Frontend
&lt;/h2&gt;

&lt;p&gt;No configuration is required on the frontend; just keep in mind that all server requests should use paths relative to /api.&lt;/p&gt;
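&lt;p&gt;For example, a tiny request helper can hard-code the /api prefix so every call automatically goes through nginx to NestJs (a minimal sketch, not from the example repository; apiUrl and apiGet are illustrative names):&lt;/p&gt;

```javascript
// Hypothetical frontend helper: every request goes through the /api
// prefix, which nginx forwards to NestJs on the same host.
const API_PREFIX = '/api';

function apiUrl(path) {
    // Normalize so callers can pass 'users' or '/users'.
    const clean = path.startsWith('/') ? path : '/' + path;
    return API_PREFIX + clean;
}

async function apiGet(path) {
    const res = await fetch(apiUrl(path));
    if (!res.ok) {
        throw new Error('Request failed: ' + res.status);
    }
    return res.json();
}

console.log(apiUrl('users')); // '/api/users'
```

&lt;p&gt;Because the path is relative, client-side calls work unchanged in development and production.&lt;/p&gt;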

&lt;h2&gt;
  
  
  9. Run locally
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;cd frontend &amp;amp;&amp;amp; npm run dev&lt;br&gt;
# in a second terminal&lt;br&gt;
cd backend &amp;amp;&amp;amp; npm run start:dev&lt;br&gt;
# in a third terminal, from the repository root&lt;br&gt;
docker-compose -f docker-compose.dev.yml up -d&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Now, we can check our website by opening localhost in the browser. In the example, we have one request on the server and another on the client. Both requests are made from Next.js and processed by Nest.js.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh7y7tqtzkowosltlcozv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh7y7tqtzkowosltlcozv.png" alt="Image description" width="800" height="485"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  10. Deploy and run on the server via GitHub Actions
&lt;/h2&gt;

&lt;p&gt;This step shows how to deploy the project to a server using a Docker registry and GitHub Actions. The process begins with creating Docker images for both the backend and the frontend and pushing them to the registry. After that, you’ll need to set up a GitHub repository and configure the necessary secrets for seamless deployment:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;DOCKERHUB_USERNAME&lt;br&gt;
DOCKERHUB_TOKEN&lt;br&gt;
DOCKER_FRONTEND_IMAGE&lt;br&gt;
DOCKER_BACKEND_IMAGE&lt;br&gt;
REMOTE_SERVER_HOST&lt;br&gt;
REMOTE_SERVER_USERNAME&lt;br&gt;
REMOTE_SERVER_SSH_KEY&lt;br&gt;
REMOTE_SERVER_SSH_PORT&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The downside of using one repository for the backend and the frontend is that every push rebuilds both images. To optimize this, we can use these conditions:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;if: contains(github.event_name, ‘push’) &amp;amp;&amp;amp; !startsWith(github.event.head_commit.message, ‘frontend’)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;if: contains(github.event_name, ‘push’) &amp;amp;&amp;amp; !startsWith(github.event.head_commit.message, ‘backend’)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It makes it possible to rebuild only the image you need by prefixing the commit message accordingly.&lt;/p&gt;
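&lt;p&gt;The intent of the two conditions can be sketched as a plain function (illustrative only; the real checks are evaluated by GitHub’s expression engine):&lt;/p&gt;

```javascript
// Mirrors the workflow conditions: a commit message starting with
// 'frontend' skips the backend build, and vice versa.
function shouldBuild(job, eventName, commitMessage) {
    if (eventName !== 'push') {
        return false;
    }
    const skipPrefix = job === 'frontend' ? 'backend' : 'frontend';
    return !commitMessage.startsWith(skipPrefix);
}

console.log(shouldBuild('frontend', 'push', 'backend: fix migration')); // false
console.log(shouldBuild('backend', 'push', 'backend: fix migration'));  // true
```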

&lt;p&gt;File: ./.github/workflows/deploy.yml&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;name: deploy nextjs and nestjs to GITHUB

on:
  push:
    branches: [ "main" ]

jobs:
  build-and-push-frontend:
    runs-on: ubuntu-latest

    if: contains(github.event_name, 'push') &amp;amp;&amp;amp; !startsWith(github.event.head_commit.message, 'backend')

    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Login to Docker Hub
        uses: docker/login-action@v1
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Build and push frontend to Docker Hub
        uses: docker/build-push-action@v2
        with:
          context: frontend
          file: frontend/Dockerfile
          push: true
          tags: ${{ secrets.DOCKER_FRONTEND_IMAGE }}:latest

      - name: SSH into the remote server and deploy frontend
        uses: appleboy/ssh-action@master
        with:
          host: ${{ secrets.REMOTE_SERVER_HOST }}
          username: ${{ secrets.REMOTE_SERVER_USERNAME }}
          key: ${{ secrets.REMOTE_SERVER_SSH_KEY }}
          port: ${{ secrets.REMOTE_SERVER_SSH_PORT }}
          script: |
            cd website/
            docker rmi -f ${{ secrets.DOCKER_FRONTEND_IMAGE }}:latest
            docker-compose down
            docker-compose up -d

  build-and-push-backend:
    runs-on: ubuntu-latest

    if: contains(github.event_name, 'push') &amp;amp;&amp;amp; !startsWith(github.event.head_commit.message, 'frontend')

    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Login to Docker Hub
        uses: docker/login-action@v1
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Build and push backend to Docker Hub
        uses: docker/build-push-action@v2
        with:
          context: backend
          file: backend/Dockerfile
          push: true
          tags: ${{ secrets.DOCKER_BACKEND_IMAGE }}:latest

      - name: SSH into the remote server and deploy backend
        uses: appleboy/ssh-action@master
        with:
          host: ${{ secrets.REMOTE_SERVER_HOST }}
          username: ${{ secrets.REMOTE_SERVER_USERNAME }}
          key: ${{ secrets.REMOTE_SERVER_SSH_KEY }}
          port: ${{ secrets.REMOTE_SERVER_SSH_PORT }}
          script: |
            cd website/
            docker rmi -f ${{ secrets.DOCKER_BACKEND_IMAGE }}:latest
            docker-compose down
            docker-compose up -d
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Repository: &lt;a href="https://github.com/xvandevx/blog-examples/tree/main/nextjs-nestjs-deploy" rel="noopener noreferrer"&gt;https://github.com/xvandevx/blog-examples/tree/main/nextjs-nestjs-deploy&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Recap
&lt;/h2&gt;

&lt;p&gt;This article is a hands-on guide to deploying Next.js and Nest.js together on a single server, making it a go-to solution for developers who want a streamlined setup. By combining the strengths of Next.js for frontend and Nest.js for backend, I showed how to efficiently manage both parts of your application using Docker and GitHub Actions. It simplifies the deployment process, allowing you to focus on building your app rather than juggling multiple configurations. Perfect for those looking to get a full-stack project up and running quickly with minimal hassle.&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>nextjs</category>
      <category>nestjs</category>
      <category>cicd</category>
    </item>
    <item>
      <title>NextJs SSR caching using Redis</title>
      <dc:creator>Ivan Alekseev</dc:creator>
      <pubDate>Mon, 22 Jul 2024 19:57:54 +0000</pubDate>
      <link>https://forem.com/xvandev/nextjs-ssr-caching-using-redis-384j</link>
      <guid>https://forem.com/xvandev/nextjs-ssr-caching-using-redis-384j</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;This is my first article, and I would like to explore the topic of caching using Redis and SSR (Server-Side Rendering). I’ll explain the importance of caching, provide an example of a basic implementation, and discuss some of the challenges I’ve faced in my projects. I will use NextJs since it has an out-of-the-box implementation of SSR.&lt;/p&gt;

&lt;h2&gt;
  
  
  Caching
&lt;/h2&gt;

&lt;p&gt;Firstly, let’s discuss the significance of caching. Caching is crucial for enhancing the performance of your project and minimizing database requests. It plays a vital role in improving response times, reducing server load, and ensuring a smoother user experience.&lt;/p&gt;

&lt;p&gt;If you are working on a public website, caching is an essential feature. Regardless of how fast your server and database are, they can become bottlenecks under high load.&lt;/p&gt;

&lt;p&gt;Moreover, site traffic depends directly on page loading speed, because Google takes it into account when indexing and ranks faster sites higher. That’s why caching and site performance are crucial for achieving success. In large projects, every millisecond counts, because even slight delays can result in significant financial losses.&lt;/p&gt;

&lt;h2&gt;
  
  
  Caching types
&lt;/h2&gt;

&lt;p&gt;There are several types of caching, like API caching, HTML caching, CDN, and frontend caching (memoization, local storage, etc). Every project should pick an exact set of tools according to its unique requirements, load, and other features. It is crucial to choose an appropriate set of tools in order not to increase the project’s complexity and maintenance costs.&lt;/p&gt;

&lt;p&gt;In this article, I want to focus on API caching using Redis together with SSR, because it helps reduce server response time, which plays a crucial role in optimization. It does not matter how well your frontend code performs if the server is slow: more often than not the client simply closes the page, and the site drops in Google rankings.&lt;/p&gt;

&lt;p&gt;Tools like Google PageSpeed Insights and Lighthouse also attach key importance to server response speed.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why do we need SSR?
&lt;/h2&gt;

&lt;p&gt;Maybe some of you have a question: why do we need to make a request on the server? Why not just send an HTML template to the client and do all the work there, fetching the data and building the page? Yes, that is fine, but only when your website does not require search engine indexing. If you want your website indexed and want to grow search traffic, your content must be included in the HTML returned by the server. Google favors websites with server-side rendered content over those that rely heavily on client-side content loading.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Redis?
&lt;/h2&gt;

&lt;p&gt;Redis is my favorite tool because of its ease of use and flexibility. Among the many advantages of using Redis for caching, several stand out. It allows for dynamic state management, enabling efficient tracking and selective deletion of data. Redis also supports database migration to a separate server and facilitates sharding for optimized data distribution. Its capability for smart cache implementation, which I’ll elaborate on in my next article, is another key benefit. Additionally, Redis makes it easy to create backups, which is crucial for quickly restoring large databases.&lt;/p&gt;

&lt;p&gt;So, let’s start implementing. Let’s assume you have already installed the Next application and there is some list of entities that you get from the server and you want to cache it.&lt;/p&gt;

&lt;h2&gt;
  
  
  1. Redis installation
&lt;/h2&gt;

&lt;p&gt;You can run Redis on any server or install it directly on your computer. Personally, I prefer using Docker:&lt;/p&gt;

&lt;p&gt;File: ./docker-compose.yml&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;services:
  redis:
    image: 'bitnami/redis:latest'
    environment:
      - REDIS_PASSWORD=password
      - REDIS_AOF_ENABLED=no
    volumes:
      - "./docker/redis:/bitnami"
    ports:
      - "6379:6379"
  redisinsight:
    container_name: redisinsight
    image: redislabs/redisinsight
    ports:
      - "8081:8001"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  2. Redis setup
&lt;/h2&gt;

&lt;p&gt;To begin, let’s create an env file to store our connection settings.&lt;/p&gt;

&lt;p&gt;File: .env&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;REDIS_HOST=localhost
REDIS_PASSWORD=password
REDIS_PORT=6379
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Next, let’s set up the Redis client.&lt;/p&gt;



&lt;p&gt;File: lib/redis.js&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import Redis from "ioredis";

const options = {
    port: process.env.REDIS_PORT,
    host: process.env.REDIS_HOST,
    password: process.env.REDIS_PASSWORD
};

const client = new Redis(options);
export default client;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  3. API function
&lt;/h2&gt;

&lt;p&gt;Next, we’ll develop a straightforward API that utilizes the Redis wrapper:&lt;/p&gt;

&lt;p&gt;File: api/index.js&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import catalog from "./catalog";

const config = {
    API_URL: 'http://localhost:3000/api/catalog'
}
export const Api = {
    catalog: catalog(config)
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;File: api/catalog.js&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import {redisGetHandler} from "../utils/redis";

export default (config) =&amp;gt; ({
    async getItems(Redis = null, reset = false) {
        return await redisGetHandler(Redis, config.API_URL, 'getItems', reset);
    }
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  4. Redis handler function
&lt;/h2&gt;

&lt;p&gt;Next, we need to create a handler function for the API that can both store and retrieve data from Redis.&lt;/p&gt;

&lt;p&gt;File: utils/redis.js&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import axios from "axios";

export async function redisGetHandler(Redis, apiPath, apiCode, reset = false) {
    const url = `${apiPath}/${apiCode}`;
    if (!Redis) {
        return await fetchData(url);
    }

    try {
        if (!reset) {
            const cachedData = await Redis.get(apiCode);
            if (cachedData) {
                return JSON.parse(cachedData);
            }
        }   
        const serverData = await fetchData(url);
        if (serverData) {
            Redis.set(apiCode, JSON.stringify(serverData), 'EX', 3600); // cache lifetime: one hour
        }
        return serverData;
    } catch(e) {
        console.log(e);
    }
    return await fetchData(url);
}

async function fetchData(url) {
    try {
        const {data} = await axios.get(url);
        return data;
    } catch (e) {
        console.log(e);
    }
    return null;
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
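&lt;p&gt;The flow of redisGetHandler is the classic cache-aside pattern: try the cache, fall back to the API, then populate the cache. It can be demonstrated with synchronous in-memory stand-ins (a sketch for illustration; the real code is async and uses ioredis and axios):&lt;/p&gt;

```javascript
// Synchronous stand-ins that mirror redisGetHandler's logic:
// 1. try the cache, 2. fall back to the "API", 3. populate the cache.
function makeStubCache() {
    const store = new Map();
    return {
        get(key) { return store.has(key) ? store.get(key) : null; },
        set(key, value) { store.set(key, value); },
    };
}

let apiHits = 0;
function fakeFetchData() {
    apiHits += 1; // counts how often the "real" API is reached
    return [{ id: 1, name: 'first item' }];
}

function getItems(cache, apiCode) {
    const cached = cache.get(apiCode);
    if (cached) {
        return JSON.parse(cached); // cache hit: no API call
    }
    const data = fakeFetchData(); // cache miss: go to the API
    cache.set(apiCode, JSON.stringify(data));
    return data;
}

const cache = makeStubCache();
getItems(cache, 'getItems'); // miss
getItems(cache, 'getItems'); // hit
console.log(apiHits); // 1
```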



&lt;h2&gt;
  
  
  5. Page component and getServerSideProps function
&lt;/h2&gt;

&lt;p&gt;Let’s integrate the API into getServerSideProps. It is essential to use Redis only in server functions, as Redis is specifically designed to operate on a server. Failing to do so will result in an error on the client side.&lt;/p&gt;

&lt;p&gt;File: pages/index.js&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import {useEffect, useState} from "react";
import {Api} from "../api";
import Redis from "../lib/redis";

export default function Home({items: serverItems}) {
    const [clientItems, setClientItems] = useState([]);
    const [isClientItemsLoading, setIsClientItemsLoading] = useState(true);

    const getClientItems = async () =&amp;gt; {
        setIsClientItemsLoading(true);
        setClientItems(await Api.catalog.getItems());
        setIsClientItemsLoading(false);
    }

    useEffect(() =&amp;gt; {
        getClientItems();
    }, []);

    return (
        &amp;lt;div&amp;gt;
            &amp;lt;h3&amp;gt;Items from client&amp;lt;/h3&amp;gt;
            {isClientItemsLoading &amp;amp;&amp;amp; (&amp;lt;div&amp;gt;Loading...&amp;lt;/div&amp;gt;)}
            {!isClientItemsLoading &amp;amp;&amp;amp; &amp;lt;ul&amp;gt;
                {clientItems.map(item =&amp;gt; &amp;lt;li key={item.id}&amp;gt;{item.name}&amp;lt;/li&amp;gt;)}
            &amp;lt;/ul&amp;gt;}
            &amp;lt;h3&amp;gt;Items from server&amp;lt;/h3&amp;gt;
            &amp;lt;ul&amp;gt;
                {serverItems.map(item =&amp;gt; &amp;lt;li key={item.id}&amp;gt;{item.name}&amp;lt;/li&amp;gt;)}
            &amp;lt;/ul&amp;gt;
        &amp;lt;/div&amp;gt;
    );
}

export async function getServerSideProps({query}) {
    const reset = Boolean(query.reset);
    const items = await Api.catalog.getItems(Redis, reset)
    return {
        props: {
            items
        }
    };
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  6. Result
&lt;/h2&gt;

&lt;p&gt;In summary, here’s the scheme: we have two API calls, one on the server and one on the client.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj9kf383pb4643mwwhr7c.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj9kf383pb4643mwwhr7c.jpg" alt="Image description" width="800" height="474"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Repository: &lt;a href="https://github.com/xvandevx/blog-examples/tree/main/nestjs-cache" rel="noopener noreferrer"&gt;https://github.com/xvandevx/blog-examples/tree/main/nestjs-cache&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You might have noticed that our requests are cached only on the server. When the API is called from the client, Redis is not utilized because it doesn’t work on the client side. This was an issue I encountered in my project when I first set up caching. The reason is that using Redis directly on the client is unsafe, and there isn’t a front-end library that supports this functionality.&lt;/p&gt;

&lt;p&gt;Next.js can help us fix it: it allows us to create an internal API wrapper within our application. To implement it, we only need a few changes in our code.&lt;/p&gt;

&lt;h2&gt;
  
  
  7. API function
&lt;/h2&gt;

&lt;p&gt;We need to implement API_WRAPPER_URL and getItemsWrapper function.&lt;/p&gt;

&lt;p&gt;File: api/index.js&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import catalog from "./catalog";

const config = {
    API_URL: 'http://localhost:3000/api/catalog',
    API_WRAPPER_URL: 'http://localhost:3000/api/apiWrapper'
}
export const Api = {
    catalog: catalog(config)
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;File: api/catalog.js&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import {redisGetHandler} from "../utils/redis";
import axios from "axios";

export default (config) =&amp;gt; ({
    async getItems(Redis = null, reset = false) {
        return await redisGetHandler(Redis, config.API_URL, 'getItems', reset);
    },
    async getItemsWrapper(reset = false) {
        const {data} = await axios.get(`${config.API_WRAPPER_URL}/getItems?reset=${reset}`);
        return data;
    }
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  8. Setup server API wrapper
&lt;/h2&gt;

&lt;p&gt;This crucial step involves configuring the server API to act as a wrapper for your API endpoint.&lt;/p&gt;

&lt;p&gt;File: pages/api/apiWrapper/getItems.js&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import Redis from "/lib/redis";
import {Api} from "../../../api";

export default async function handler(req, res) {
    const data = await Api.catalog.getItems(Redis, req.query.reset === 'true');
    res.status(200).json(data);
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Another advantage is the ability to conceal your API endpoint URL using this wrapper, which enhances security.&lt;/p&gt;
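&lt;p&gt;For example, the wrapper can read the real endpoint from a server-only environment variable, so the browser never sees it (a sketch; CATALOG_API_URL is a hypothetical variable name):&lt;/p&gt;

```javascript
// Hypothetical: the upstream endpoint lives in a server-only env var,
// so the browser only ever sees the /api/apiWrapper/... route.
const UPSTREAM_URL = process.env.CATALOG_API_URL || 'http://localhost:3000/api/catalog';

function upstreamFor(apiCode) {
    return UPSTREAM_URL + '/' + apiCode;
}

console.log(upstreamFor('getItems'));
```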

&lt;h2&gt;
  
  
  9. Change API calls from external to the API wrapper
&lt;/h2&gt;

&lt;p&gt;All you need to do is to change the call from Api.catalog.getItems() to Api.catalog.getItemsWrapper().&lt;/p&gt;

&lt;p&gt;File: pages/index.js&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import {useEffect, useState} from "react";
import {Api} from "../api";

export default function Home({items: serverItems}) {
    const [clientItems, setClientItems] = useState([]);
    const [isClientItemsLoading, setIsClientItemsLoading] = useState(true);
    const getClientItems = async () =&amp;gt; {
        setIsClientItemsLoading(true);
        setClientItems(await Api.catalog.getItemsWrapper());
        setIsClientItemsLoading(false);
    };
    useEffect(() =&amp;gt; {
        getClientItems();
    }, []);
    return (
        {/* the same JSX as in step 5 */}
    );
}
export async function getServerSideProps({query}) {
    const reset = Boolean(query.reset);
    const items = await Api.catalog.getItemsWrapper(reset);
    return {
        props: {
            items
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Thus, no matter where we request from (server or client) — we always get data from Redis, which means we get it quickly and optimally.&lt;/p&gt;

&lt;p&gt;Repository: &lt;a href="https://github.com/xvandevx/blog-examples/tree/main/nestjs-cache-wrapper" rel="noopener noreferrer"&gt;https://github.com/xvandevx/blog-examples/tree/main/nestjs-cache-wrapper&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Cache resetting
&lt;/h2&gt;

&lt;p&gt;Resetting the cache is a complex topic, which I plan to delve into further in my upcoming article. Its implementation demands a thorough understanding of your project’s mechanics. Simply caching all API endpoints in Redis with a timeout won’t suffice. Certain projects, like online stores where real-time price updates are crucial, cannot afford discrepancies between cached data on detail pages and checkout prices. Even small, low-traffic stores would find such discrepancies disappointing for their customers.&lt;/p&gt;

&lt;p&gt;Therefore, it’s essential to adopt appropriate caching strategies for each part of your application. In this article, I focus on scenarios where immediate cache updates aren’t critical — I simply define a cache lifetime and leave it at that.&lt;/p&gt;
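&lt;p&gt;A fixed cache lifetime is easy to model: treat an entry older than the TTL as a miss. With ioredis the same effect comes from passing an expiry to set, e.g. Redis.set(key, value, 'EX', seconds); the stub below just makes the behavior visible (an illustrative sketch):&lt;/p&gt;

```javascript
// TTL-aware cache stub: an entry older than ttlMs behaves like a miss.
function makeTtlCache(ttlMs, now = Date.now) {
    const store = new Map();
    return {
        get(key) {
            const entry = store.get(key);
            if (!entry) {
                return null;
            }
            if (now() - entry.storedAt >= ttlMs) {
                store.delete(key); // expired: drop it and report a miss
                return null;
            }
            return entry.value;
        },
        set(key, value) {
            store.set(key, { value, storedAt: now() });
        },
    };
}

// A simulated clock keeps the example deterministic.
let clock = 0;
const cache = makeTtlCache(300000, () => clock); // 5-minute lifetime
cache.set('getItems', '[{"id":1}]');
console.log(cache.get('getItems')); // '[{"id":1}]'
clock = 300001; // five minutes later...
console.log(cache.get('getItems')); // null
```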

&lt;p&gt;In addition, I recommend implementing manual cache resetting for every project because it is helpful and easy to implement. From time to time you want to see the changed data on the page without waiting for the cache to expire. For this, I recommend using a URL parameter, for example ?reset=true:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;export async function getServerSideProps({query}) {
    const reset = Boolean(query.reset);
    const items = await Api.catalog.getItems(Redis, reset);
    return {
        props: {
            items
        }
    };
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Content managers and testers will be very grateful to you for this!&lt;/p&gt;

&lt;p&gt;And that’s it! Now we have a cached API, which increases performance and avoids heavy database requests on every API call.&lt;/p&gt;

&lt;h2&gt;
  
  
  Recap
&lt;/h2&gt;

&lt;p&gt;In this article, I covered the basic use case of Redis with NextJs, which is sufficient for small projects and is well suited for learning purposes. Large-scale projects typically use more complex systems incorporating Redis, KeyDB, Memcached, HTML caching, CDNs, client caching, and other technologies. It is crucial to choose a set of tools that fits your particular project without increasing its complexity or slowing down development.&lt;/p&gt;

&lt;p&gt;If you work at a big company, more than likely you don’t need to worry about things like caching and page load speed. Often this is the responsibility of a separate team, or it is handled on the backend side, or you are building something where caching is not required at all. Nevertheless, in my view, every front-end developer should understand how caching works and how to deal with it.&lt;/p&gt;

&lt;p&gt;Moreover, in our rapidly changing time, as AI enters our lives, the demand for full-stack developers will start to grow again, as it did 15 years ago. That is why knowledge like this will help you stand out from other developers on the market.&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
