Next.js API Rate Limiting with Middleware

As your application scales, managing API usage becomes crucial to maintain performance and security. Rate limiting is a key strategy to control the frequency of API requests, prevent abuse, and ensure fair usage among clients. In this guide, we’ll explore how to implement API rate limiting in a Next.js application using middleware.


What Is Rate Limiting?

Rate limiting restricts the number of API requests a client can make within a specific time window. For example, you might allow a maximum of 100 requests per user per minute. If the limit is exceeded, the server responds with an error or throttles further requests.
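To make the idea concrete, here is a minimal in-memory sketch of a fixed-window counter in TypeScript. It is illustrative only: the FixedWindowLimiter name and API are invented for this example, and a single-process Map is not suitable for production (which is why the middleware later in this guide uses a shared Redis store).

```typescript
// Minimal in-memory fixed-window rate limiter (illustrative only;
// a real deployment needs shared storage such as Redis).
type Window = { count: number; windowStart: number };

class FixedWindowLimiter {
  private windows = new Map<string, Window>();

  constructor(
    private limit: number, // max requests per window
    private windowMs: number // window length in milliseconds
  ) {}

  // Returns true if the request is allowed, false if the limit is exceeded.
  allow(clientId: string, now: number = Date.now()): boolean {
    const w = this.windows.get(clientId);
    if (!w || now - w.windowStart >= this.windowMs) {
      // New window: reset the counter
      this.windows.set(clientId, { count: 1, windowStart: now });
      return true;
    }
    w.count += 1;
    return w.count <= this.limit;
  }
}

// 3 requests per second for demonstration
const limiter = new FixedWindowLimiter(3, 1000);
const results = [1, 2, 3, 4].map(() => limiter.allow('client-a', 1000));
console.log(results); // first three allowed, fourth rejected
```

Each client gets its own counter keyed by an identifier (IP address, user ID, etc.), which is exactly the shape the Redis-backed limiter uses at scale.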

Benefits of Rate Limiting

  • Improved Performance: Prevents server overload by controlling traffic.

  • Security: Mitigates denial-of-service (DoS) attacks and other abuse.

  • Fair Usage: Ensures all clients have equitable access to resources.


Setting Up Middleware for Rate Limiting in Next.js

Since Next.js 12.2, stable middleware has provided an efficient way to process requests before they reach API routes or pages. We’ll leverage middleware to implement rate limiting.

Step 1: Install Dependencies

We’ll use the @upstash/ratelimit package along with Upstash Redis for rate limiting. Install the required packages:

npm install @upstash/redis @upstash/ratelimit

Step 2: Configure Upstash Redis

  1. Create an account on Upstash.

  2. Set up a new Redis database and note the connection URL and token.

Step 3: Implement Middleware

Create a middleware.ts file in the src directory (or at the project root if you don’t use src) to handle rate limiting.

// src/middleware.ts
import { NextResponse } from 'next/server';
import type { NextRequest } from 'next/server';
import { Redis } from '@upstash/redis';
import { Ratelimit } from '@upstash/ratelimit';

// Initialize Redis client
const redis = new Redis({
  url: process.env.UPSTASH_REDIS_URL!,
  token: process.env.UPSTASH_REDIS_TOKEN!,
});

// Initialize Ratelimit with Redis store
const ratelimit = new Ratelimit({
  redis,
  limiter: Ratelimit.fixedWindow(100, '1 m'), // 100 requests per minute
});

export default async function middleware(req: NextRequest) {
  // x-forwarded-for may hold a comma-separated list; the first entry is the client
  const ip =
    req.headers.get('x-forwarded-for')?.split(',')[0]?.trim() ?? 'unknown';

  // Check rate limit for the IP address
  const { success, reset } = await ratelimit.limit(ip);

  if (!success) {
    // reset is a UNIX timestamp in ms; Retry-After expects seconds from now
    const retryAfter = Math.max(0, Math.ceil((reset - Date.now()) / 1000));
    return NextResponse.json(
      { error: 'Too many requests. Please try again later.' },
      { status: 429, headers: { 'Retry-After': String(retryAfter) } }
    );
  }

  return NextResponse.next();
}

export const config = {
  matcher: '/api/:path*', // Apply middleware only to API routes
};

How It Works:

  1. IP-Based Rate Limiting: The middleware extracts the client’s IP address and checks its request count using the ratelimit instance.

  2. Fixed Window: The rate limiter allows a maximum of 100 requests per IP per minute.

  3. Response on Exceeding Limit: If the client exceeds the limit, a 429 Too Many Requests response is returned, with a Retry-After header telling the client how long to wait before retrying.
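One caveat of the fixed-window algorithm is that it permits bursts at window boundaries (up to 200 requests straddling two adjacent minutes). @upstash/ratelimit also ships a sliding-window limiter that smooths this out; swapping it in is a one-line change to the limiter configuration:

```typescript
// Sliding window: counts requests over a rolling minute rather than
// a calendar-aligned window, avoiding boundary bursts
const ratelimit = new Ratelimit({
  redis,
  limiter: Ratelimit.slidingWindow(100, '1 m'), // ~100 requests per rolling minute
});
```

The sliding window costs slightly more per check (it reads two windows) but is usually the better default for public APIs.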

Step 4: Add Environment Variables

Add the following environment variables to your .env.local file:

UPSTASH_REDIS_URL=<your-upstash-redis-url>
UPSTASH_REDIS_TOKEN=<your-upstash-redis-token>

Testing the Middleware

  1. Start the Server: Run the development server:

     npm run dev
    
  2. Send API Requests: Use a tool like curl or Postman to send requests to an API route (e.g., /api/example).

  3. Check Rate Limiting: After exceeding 100 requests per minute, you should receive a 429 Too Many Requests error.

curl -X GET http://localhost:3000/api/example
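To exercise the limit without issuing requests by hand, a small shell loop can fire them and tally the status codes. This sketch assumes the dev server is running on port 3000 and that an /api/example route exists:

```shell
# Fire 105 requests and tally the status codes; with a 100/min limit,
# roughly the first 100 should be 200 and the remainder 429
for i in $(seq 1 105); do
  curl -s -o /dev/null -w "%{http_code}\n" http://localhost:3000/api/example
done | sort | uniq -c
```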

Enhancing the Middleware

User-Specific Rate Limiting

Instead of IP-based rate limiting, you can use a user’s authentication token or unique identifier.

// Use the Authorization header (e.g. a bearer token) as the bucket key,
// falling back to a shared bucket for unauthenticated requests
const userId = req.headers.get('authorization') ?? 'anonymous';
const { success, reset } = await ratelimit.limit(userId);
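Raw Authorization headers work as keys, but extracting just the token yields a cleaner, more stable identifier. A tiny helper (hypothetical, not part of @upstash/ratelimit) can do that and fall back gracefully for anonymous or non-bearer requests:

```typescript
// Extract a bearer token from an Authorization header value.
// Returns the fallback for missing or non-bearer headers.
function identifierFromAuth(
  header: string | null,
  fallback: string = 'anonymous'
): string {
  if (!header) return fallback;
  const match = /^Bearer\s+(.+)$/i.exec(header.trim());
  return match ? match[1] : fallback;
}

// Usage inside middleware (falling back to the client IP):
// const userId = identifierFromAuth(req.headers.get('authorization'), ip);
```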

Different Limits for Different Routes

You can customize rate limits for specific routes by modifying the middleware’s matcher configuration or checking the request URL.

// @upstash/ratelimit fixes the limit per instance, so create a stricter one
const strictRatelimit = new Ratelimit({
  redis,
  limiter: Ratelimit.fixedWindow(50, '1 m'), // 50 requests per minute
});

if (req.nextUrl.pathname.startsWith('/api/private')) {
  const { success, reset } = await strictRatelimit.limit(ip);
  // ...
}

Best Practices

  1. Monitor Usage: Regularly review rate limit metrics to adjust limits as needed.

  2. Graceful Messaging: Provide informative error messages and retry headers.

  3. Test Thoroughly: Simulate high traffic to ensure the middleware handles edge cases.
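On the graceful-messaging point: ratelimit.limit() also returns limit and remaining alongside success and reset, and reset is a UNIX timestamp in milliseconds. A small helper (illustrative; the X-RateLimit-* names follow a common convention rather than a formal standard) can turn those fields into informative response headers:

```typescript
// Build informative rate-limit headers from a limit result.
// reset is a UNIX timestamp in ms, as returned by @upstash/ratelimit.
function rateLimitHeaders(
  limit: number,
  remaining: number,
  reset: number,
  now: number = Date.now()
): Record<string, string> {
  return {
    'X-RateLimit-Limit': String(limit),
    'X-RateLimit-Remaining': String(Math.max(0, remaining)),
    // Seconds until the window resets; clamped at 0 if already passed
    'Retry-After': String(Math.max(0, Math.ceil((reset - now) / 1000))),
  };
}

// Usage in the 429 branch:
// { status: 429, headers: rateLimitHeaders(limit, remaining, reset) }
```

Attaching these headers on every response (not just rejections) lets well-behaved clients pace themselves before ever hitting the limit.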


Conclusion

Implementing API rate limiting in Next.js with middleware ensures secure and scalable API usage. By leveraging Upstash Redis and the @upstash/ratelimit package, you can effectively manage traffic and protect your application from abuse. Whether you’re building public APIs or internal tools, rate limiting is a vital strategy for maintaining performance and fairness.