Express · JavaScript

Fix TooManyRequestsError: Too Many Requests — rate limit exceeded in Express

This error occurs when a client exceeds the configured request rate limit. The express-rate-limit middleware blocks further requests and returns a 429 status. Fix it by tuning the rate limit window and max values, adding a custom handler with a Retry-After header, and using a persistent store for multi-instance deployments.

Reading the Stack Trace

HTTP/1.1 429 Too Many Requests
X-RateLimit-Limit: 100
X-RateLimit-Remaining: 0
X-RateLimit-Reset: 1700000060
Retry-After: 60

Error: Too Many Requests
    at rateLimit (/app/node_modules/express-rate-limit/dist/index.js:72:23)
    at Layer.handle [as handle_request] (/app/node_modules/express/lib/router/layer.js:95:5)
    at trim_prefix (/app/node_modules/express/lib/router/index.js:328:13)
    at /app/node_modules/express/lib/router/index.js:284:15
    at Function.process_params (/app/node_modules/express/lib/router/index.js:346:12)
    at next (/app/node_modules/express/lib/router/index.js:280:10)
    at expressInit (/app/node_modules/express/lib/middleware/init.js:40:5)
    at Layer.handle [as handle_request] (/app/node_modules/express/lib/router/layer.js:95:5)

Here's what each part means:

  1. HTTP/1.1 429 Too Many Requests — the middleware rejected the request with a 429 status.
  2. X-RateLimit-Limit: 100 — the maximum number of requests allowed per window.
  3. X-RateLimit-Remaining: 0 — the client has no requests left in the current window.
  4. X-RateLimit-Reset: 1700000060 — the Unix timestamp (in seconds) when the window resets.
  5. Retry-After: 60 — how many seconds the client should wait before retrying.
  6. at rateLimit (...) — the express-rate-limit middleware is where the request was blocked; the remaining frames are Express's internal routing, not your application code.

Common Causes

1. Rate limit window too small

The windowMs or max values are too restrictive, causing legitimate users to hit the limit during normal usage.

const rateLimit = require('express-rate-limit');

const limiter = rateLimit({
  windowMs: 1 * 60 * 1000, // 1 minute
  max: 5 // Only 5 requests per minute
});

app.use(limiter);

2. In-memory store with multiple instances

Using the default in-memory store behind a load balancer means each instance tracks its own counter, leading to inconsistent rate limiting.

const rateLimit = require('express-rate-limit');

// Default in-memory store — not shared across cluster workers
const limiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 100
});

app.use(limiter);
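The fix for multi-instance deployments is a shared store so all instances count against the same limit. Here is a sketch using the rate-limit-redis adapter with the node-redis client (the REDIS_URL environment variable is an assumption; adjust connection handling for your deployment):

```
const rateLimit = require('express-rate-limit');
const { RedisStore } = require('rate-limit-redis');
const { createClient } = require('redis');

// One Redis instance holds the counters, so every app instance
// behind the load balancer enforces the same limit per client.
const client = createClient({ url: process.env.REDIS_URL });
client.connect();

const limiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 100,
  store: new RedisStore({
    // Forward raw commands to the node-redis client
    sendCommand: (...args) => client.sendCommand(args)
  })
});

app.use(limiter);
```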

3. No custom handler for rate-limited requests

The default 429 response does not include a helpful message or Retry-After header, leaving clients without guidance on when to retry.

const rateLimit = require('express-rate-limit');

const limiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 100
  // No handler or message configured
});

app.use('/api/', limiter);

The Fix

Increase the rate limit window to 15 minutes with 100 requests, enable standard RateLimit headers, and add a custom handler that returns a clear JSON error with retry timing so clients know when they can resume requests.

Before (broken)
const rateLimit = require('express-rate-limit');

const limiter = rateLimit({
  windowMs: 1 * 60 * 1000,
  max: 5
});

app.use(limiter);
After (fixed)
const rateLimit = require('express-rate-limit');

const limiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 100, // 100 requests per 15-minute window
  standardHeaders: true, // Return RateLimit-* headers
  legacyHeaders: false,
  handler: (req, res) => {
    // resetTime is a Date; compute seconds until the window resets
    const retryAfterSeconds = Math.ceil((req.rateLimit.resetTime - Date.now()) / 1000);
    res.set('Retry-After', String(retryAfterSeconds));
    res.status(429).json({
      error: 'Too many requests, please try again later.',
      retryAfter: retryAfterSeconds
    });
  }
});

app.use('/api/', limiter);
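On the client side, callers can honor the Retry-After header before retrying. A minimal sketch — parseRetryAfter and fetchWithRetry are hypothetical helpers, not part of express-rate-limit; per the HTTP spec, Retry-After may be either delta-seconds or an HTTP date, so the parser handles both:

```javascript
// Convert a Retry-After header value into milliseconds to wait.
// Accepts either delta-seconds ("60") or an HTTP date.
function parseRetryAfter(value, now = Date.now()) {
  if (value == null || value === '') return null;
  const seconds = Number(value);
  if (Number.isFinite(seconds)) return Math.max(0, seconds * 1000);
  const date = Date.parse(value);
  return Number.isNaN(date) ? null : Math.max(0, date - now);
}

// Example: wait out the window, then retry once (uses Node 18+ global fetch).
async function fetchWithRetry(url) {
  const res = await fetch(url);
  if (res.status !== 429) return res;
  const waitMs = parseRetryAfter(res.headers.get('Retry-After')) ?? 1000;
  await new Promise((resolve) => setTimeout(resolve, waitMs));
  return fetch(url);
}
```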

Testing the Fix

const request = require('supertest');
const express = require('express');
const rateLimit = require('express-rate-limit');

function createApp(max = 2) {
  const app = express();
  app.use(rateLimit({
    windowMs: 15 * 60 * 1000,
    max,
    handler: (req, res) => {
      res.status(429).json({ error: 'Too many requests, please try again later.' });
    }
  }));
  app.get('/api/data', (req, res) => res.json({ ok: true }));
  return app;
}

describe('Rate limiting', () => {
  it('allows requests under the limit', async () => {
    const res = await request(createApp(2)).get('/api/data');
    expect(res.status).toBe(200);
  });

  it('returns 429 when limit is exceeded', async () => {
    const app = createApp(1);
    await request(app).get('/api/data');
    const res = await request(app).get('/api/data');
    expect(res.status).toBe(429);
    expect(res.body.error).toBe('Too many requests, please try again later.');
  });
});

Run your tests:

npx jest --testPathPattern=rate-limit

Pushing Through CI/CD

git checkout -b fix/express-rate-limit-error
git add src/middleware/rateLimiter.js src/__tests__/rateLimit.test.js
git commit -m "fix: tune rate limit config and add custom 429 handler"
git push origin fix/express-rate-limit-error

Your CI config should look something like this:

name: CI
on:
  pull_request:
    branches: [main]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'npm'
      - run: npm ci
      - run: npx jest --coverage
      - run: npm run lint

The Full Manual Process: 18 Steps

Here's every step you just went through to fix this one bug:

  1. Notice the error alert or see it in your monitoring tool
  2. Open the error dashboard and read the stack trace
  3. Identify the file and line number from the stack trace
  4. Open your IDE and navigate to the file
  5. Read the surrounding code to understand context
  6. Reproduce the error locally
  7. Identify the root cause
  8. Write the fix
  9. Run the test suite locally
  10. Fix any failing tests
  11. Write new tests covering the edge case
  12. Run the full test suite again
  13. Create a new git branch
  14. Commit and push your changes
  15. Open a pull request
  16. Wait for code review
  17. Merge and deploy to production
  18. Monitor production to confirm the error is resolved

Total time: 30-60 minutes. For one bug.

Or Let bugstack Fix It in Under 2 Minutes

Every step above? bugstack does it automatically.

Step 1: Install the SDK

npm install bugstack-sdk

Step 2: Initialize

const { initBugStack } = require('bugstack-sdk');

initBugStack({ apiKey: process.env.BUGSTACK_API_KEY });

Step 3: There is no step 3.

bugstack handles everything from here:

  1. Captures the stack trace and request context
  2. Pulls the relevant source files from your GitHub repo
  3. Analyzes the error and understands the code context
  4. Generates a minimal, verified fix
  5. Runs your existing test suite
  6. Pushes through your CI/CD pipeline
  7. Deploys to production (or opens a PR for review)

Time from error to fix deployed: Under 2 minutes.

Human involvement: zero.

Try bugstack Free →

No credit card. 5-minute setup. Cancel anytime.

Deploying the Fix (Manual Path)

  1. Run the test suite locally to confirm rate limiting works as expected.
  2. Open a pull request with the rate limiter configuration changes.
  3. Wait for CI checks to pass on the PR.
  4. Have a teammate review and approve the PR.
  5. Merge to main and monitor 429 response rates in staging before promoting to production.

Frequently Asked Questions

How does BugStack verify a rate-limiting fix is safe?

BugStack tests the rate limiter with synthetic traffic, verifies correct 429 responses and Retry-After headers, and confirms legitimate requests still pass before marking it safe.

Will fixes reach production without human review?

Every fix is delivered as a pull request with full CI validation. Your team reviews and approves before anything reaches production.

Do I need a shared store when running multiple server instances?

Yes. The default in-memory store does not share state across multiple server instances. Use rate-limit-redis or a similar adapter so all instances enforce a consistent limit per client.

How do I exempt health checks or internal endpoints from rate limiting?

Apply the rate limiter to a specific path prefix like app.use('/api/', limiter) or use the skip option to bypass limiting for health checks and other internal endpoints.
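The skip approach can be sketched as a plain predicate (the paths listed here are illustrative, not prescribed by express-rate-limit):

```javascript
// Paths that should never be rate limited (illustrative list).
const INTERNAL_PATHS = new Set(['/healthz', '/readyz', '/metrics']);

// express-rate-limit calls skip(req, res); returning true bypasses the limiter.
function skipInternal(req) {
  return INTERNAL_PATHS.has(req.path);
}

// Usage:
// const limiter = rateLimit({ windowMs: 15 * 60 * 1000, max: 100, skip: skipInternal });
```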