Juan Castillo

Implementing Rate Limiting in NestJS with Redis for Scalable Applications 🚀

Introduction

Rate limiting is essential to prevent abuse and ensure fair usage of your API. When running multiple instances of your NestJS application, using Redis as a centralized store allows rate limits to be shared across all instances. In this guide, we'll implement rate limiting in NestJS using Redis.

Why Use Redis for Rate Limiting?

When deploying your NestJS application in a distributed environment (multiple instances behind a load balancer), storing rate-limit counters in each instance's memory won't work consistently, because every instance only sees its own slice of the traffic (the sketch after this list shows why). Instead, using Redis provides:

  • Shared state across all instances 🏢
  • High performance for quick lookups
  • Persistence even after app restarts 🔄
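
To make the problem concrete, here is a minimal, illustrative sketch (not part of the final setup) of the naive in-memory approach. Each instance keeps its own counter, so a client whose requests are spread across two instances is effectively allowed double the limit:

// Naive per-instance counter (illustrative sketch): each app instance has its own Map,
// so the count only reflects traffic that happened to land on THIS instance.
const requestCounts = new Map<string, number>();

function naiveConsume(ip: string, limit = 10): boolean {
  const count = (requestCounts.get(ip) ?? 0) + 1;
  requestCounts.set(ip, count);
  return count <= limit; // a second instance would happily allow another `limit` requests
}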

Setting Up Redis and Dependencies

First, install the required Node.js dependencies (the Redis clients and the rate limiter):

npm install redis ioredis rate-limiter-flexible

If you haven't installed Redis yet, you can run it using Docker:

docker run -d --name redis -p 6379:6379 redis
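
Once the container is up, you can optionally confirm that your app can reach it with a quick ioredis ping. This is a throwaway sketch (the host and port assume the Docker setup above); you could drop it into a scratch file and run it with ts-node:

// Optional sanity check: ping Redis before wiring up the rate limiter.
import Redis from 'ioredis';

const redis = new Redis({ host: 'localhost', port: 6379 });

redis.ping().then((reply) => {
  console.log(`Redis says: ${reply}`); // expect "PONG"
  redis.disconnect();
});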

Implementing Rate Limiting in NestJS

1. Create a RateLimitMiddleware

Create a new middleware file:

mkdir src/middleware && touch src/middleware/rate-limit.middleware.ts

Now, implement the rate-limiting logic using rate-limiter-flexible:

import { Injectable, NestMiddleware, HttpException, HttpStatus } from '@nestjs/common';
import { Request, Response, NextFunction } from 'express';
import { RateLimiterRedis } from 'rate-limiter-flexible';
import Redis from 'ioredis';

const redisClient = new Redis({
  host: 'localhost', // Change this if Redis runs on a different host
  port: 6379,
  enableOfflineQueue: false,
});

const rateLimiter = new RateLimiterRedis({
  storeClient: redisClient,
  keyPrefix: 'middleware',
  points: 10, // 10 requests
  duration: 60, // per 60 seconds
});

@Injectable()
export class RateLimitMiddleware implements NestMiddleware {
  async use(req: Request, res: Response, next: NextFunction) {
    try {
      await rateLimiter.consume(req.ip); // Identify clients by IP
      next();
    } catch (rejRes) {
      throw new HttpException('Too Many Requests', HttpStatus.TOO_MANY_REQUESTS);
    }
  }
}
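
One possible refinement, sketched below and reusing the rateLimiter instance defined above: when consume rejects because the limit was hit, rate-limiter-flexible passes a result object exposing msBeforeNext, which you can surface as a Retry-After header so clients know when to back off. The class name here is just for illustration.

import { Injectable, NestMiddleware, HttpException, HttpStatus } from '@nestjs/common';
import { Request, Response, NextFunction } from 'express';
import { RateLimiterRes } from 'rate-limiter-flexible';

@Injectable()
export class RateLimitWithRetryAfterMiddleware implements NestMiddleware {
  async use(req: Request, res: Response, next: NextFunction) {
    try {
      await rateLimiter.consume(req.ip); // same limiter instance as above
      next();
    } catch (rejRes) {
      if (rejRes instanceof RateLimiterRes) {
        // Limit exceeded: tell the client how many seconds to wait before retrying
        res.setHeader('Retry-After', Math.ceil(rejRes.msBeforeNext / 1000));
      }
      // Note: Redis/network errors also land here; you may prefer to fail open in that case
      throw new HttpException('Too Many Requests', HttpStatus.TOO_MANY_REQUESTS);
    }
  }
}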

2. Apply Middleware Globally

Modify AppModule to register the middleware globally:

import { Module, MiddlewareConsumer, NestModule } from '@nestjs/common';
import { RateLimitMiddleware } from './middleware/rate-limit.middleware';

@Module({})
export class AppModule implements NestModule {
  configure(consumer: MiddlewareConsumer) {
    consumer.apply(RateLimitMiddleware).forRoutes('*');
  }
}
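
If you'd rather not throttle every route, the middleware consumer also lets you exclude paths or target specific ones. Here is a sketch (the 'health' route is a hypothetical example) that skips a health-check endpoint your load balancer might poll frequently:

import { Module, MiddlewareConsumer, NestModule, RequestMethod } from '@nestjs/common';
import { RateLimitMiddleware } from './middleware/rate-limit.middleware';

@Module({})
export class AppModule implements NestModule {
  configure(consumer: MiddlewareConsumer) {
    consumer
      .apply(RateLimitMiddleware)
      .exclude({ path: 'health', method: RequestMethod.GET }) // hypothetical health-check route
      .forRoutes('*'); // everything else stays rate limited
  }
}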

3. Running and Testing 🛠️

Start Redis and your NestJS app:

docker start redis  # If using Docker
npm run start

Then, send multiple requests to test rate limiting:

curl -X GET http://localhost:3000
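
To hammer the endpoint quickly, a small script works too. This sketch assumes the app listens on http://localhost:3000 and that you are on Node 18+ (built-in fetch); the 11th and 12th requests should come back as 429:

// Quick test script (e.g. test-rate-limit.ts): send 12 requests and log each status code.
async function main() {
  for (let i = 1; i <= 12; i++) {
    const res = await fetch('http://localhost:3000');
    console.log(`Request ${i}: ${res.status}`); // 200 for the first 10, then 429
  }
}

main().catch(console.error);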

After 10 requests within 60 seconds, any further request should get:

{"statusCode":429,"message":"Too Many Requests"}

Conclusion

With Redis and rate-limiter-flexible, we implemented a scalable rate-limiting solution for a distributed NestJS application. This setup ensures consistent request limits across multiple instances, preventing abuse while maintaining high performance. 🚀🔥

Happy coding! 🎉
