Fix TooManyRequestsError: Too Many Requests — rate limit exceeded in Express
This error occurs when a client exceeds the configured request rate limit. The express-rate-limit middleware blocks further requests and returns a 429 status. Fix it by tuning the rate limit window and max values, adding a custom handler with a Retry-After header, and using a persistent store for multi-instance deployments.
Reading the Stack Trace
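A representative stack trace, assembled from the frames explained below (exact paths and line numbers will vary by project and library version):

```
TooManyRequestsError: Too Many Requests — rate limit exceeded
    at rateLimit (/app/node_modules/express-rate-limit/dist/index.js:72:23)
    at Layer.handle [as handle_request] (/app/node_modules/express/lib/router/layer.js:95:5)
    at trim_prefix (/app/node_modules/express/lib/router/index.js:328:13)
```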
Here's what each line means:
- at rateLimit (/app/node_modules/express-rate-limit/dist/index.js:72:23): The rate limit middleware determined the client exceeded the maximum number of requests and is generating the 429 response.
- at Layer.handle [as handle_request] (/app/node_modules/express/lib/router/layer.js:95:5): Express is executing the rate limit middleware in the middleware chain before the request reaches any route handler.
- at trim_prefix (/app/node_modules/express/lib/router/index.js:328:13): Express is matching the request path and dispatching through the middleware stack.
Common Causes
1. Rate limit window too small
The windowMs or max values are too restrictive, causing legitimate users to hit the limit during normal usage.
const rateLimit = require('express-rate-limit');

const limiter = rateLimit({
  windowMs: 1 * 60 * 1000, // 1 minute
  max: 5 // Only 5 requests per minute
});

app.use(limiter);
2. In-memory store with multiple instances
Using the default in-memory store behind a load balancer means each instance tracks its own counter, leading to inconsistent rate limiting.
const rateLimit = require('express-rate-limit');

// Default in-memory store — not shared across cluster workers
const limiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 100
});

app.use(limiter);
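A shared store fixes this. One common approach, sketched here with the `rate-limit-redis` adapter and the `ioredis` client (both assumed to be installed, with a `REDIS_URL` environment variable as an illustrative connection setting), is to point every instance at the same Redis:

```javascript
const rateLimit = require('express-rate-limit');
const { RedisStore } = require('rate-limit-redis');
const Redis = require('ioredis');

// All instances behind the load balancer share this Redis, so a
// client's counter is consistent no matter which instance responds.
const redisClient = new Redis(process.env.REDIS_URL);

const limiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 100,
  store: new RedisStore({
    // rate-limit-redis uses this callback to issue raw Redis commands
    sendCommand: (...args) => redisClient.call(...args)
  })
});

app.use(limiter);
```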
3. No custom handler for rate-limited requests
The default 429 response does not include a helpful message or Retry-After header, leaving clients without guidance on when to retry.
const rateLimit = require('express-rate-limit');

const limiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 100
  // No handler or message configured
});

app.use('/api/', limiter);
The Fix
Increase the rate limit window to 15 minutes with 100 requests, enable standard RateLimit headers, and add a custom handler that returns a clear JSON error with retry timing so clients know when they can resume requests.
Before:

const rateLimit = require('express-rate-limit');

const limiter = rateLimit({
  windowMs: 1 * 60 * 1000,
  max: 5
});

app.use(limiter);

After:

const rateLimit = require('express-rate-limit');

const limiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 100, // 100 requests per 15-minute window
  standardHeaders: true, // Return RateLimit-* headers
  legacyHeaders: false, // Disable deprecated X-RateLimit-* headers
  handler: (req, res) => {
    // resetTime is a Date marking the end of the current window,
    // so convert it to seconds-until-retry, not an epoch timestamp
    const retryAfter = Math.max(
      0,
      Math.ceil((req.rateLimit.resetTime - Date.now()) / 1000)
    );
    res.set('Retry-After', String(retryAfter));
    res.status(429).json({
      error: 'Too many requests, please try again later.',
      retryAfter
    });
  }
});

app.use('/api/', limiter);
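Note that express-rate-limit exposes `req.rateLimit.resetTime` as a `Date` (the end of the current window), so a useful Retry-After value is its difference from the current time, in seconds. The arithmetic can be checked in isolation; `retryAfterSeconds` below is our own helper name, not part of the library:

```javascript
// Sketch: computing a Retry-After value in seconds from the
// resetTime Date that express-rate-limit attaches to req.rateLimit.
// retryAfterSeconds is an illustrative helper, not a library API.
function retryAfterSeconds(resetTime, now = Date.now()) {
  return Math.max(0, Math.ceil((resetTime.getTime() - now) / 1000));
}

// Fixed timestamps for determinism: window ends 90s after "now"
console.log(retryAfterSeconds(new Date(100000), 10000)); // 90
// A window that has already reset never yields a negative value
console.log(retryAfterSeconds(new Date(0), 10000)); // 0
```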
Testing the Fix
const request = require('supertest');
const express = require('express');
const rateLimit = require('express-rate-limit');

function createApp(max = 2) {
  const app = express();
  app.use(rateLimit({
    windowMs: 15 * 60 * 1000,
    max,
    handler: (req, res) => {
      res.status(429).json({ error: 'Too many requests, please try again later.' });
    }
  }));
  app.get('/api/data', (req, res) => res.json({ ok: true }));
  return app;
}

describe('Rate limiting', () => {
  it('allows requests under the limit', async () => {
    const res = await request(createApp(2)).get('/api/data');
    expect(res.status).toBe(200);
  });

  it('returns 429 when limit is exceeded', async () => {
    const app = createApp(1);
    await request(app).get('/api/data');
    const res = await request(app).get('/api/data');
    expect(res.status).toBe(429);
    expect(res.body.error).toBe('Too many requests, please try again later.');
  });
});
Run your tests:
npx jest --testPathPattern=rate-limit
Pushing Through CI/CD
git checkout -b fix/express-rate-limit-error
git add src/middleware/rateLimiter.js src/__tests__/rateLimit.test.js
git commit -m "fix: tune rate limit config and add custom 429 handler"
git push origin fix/express-rate-limit-error
Your CI config should look something like this:
name: CI

on:
  pull_request:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'npm'
      - run: npm ci
      - run: npx jest --coverage
      - run: npm run lint
The Full Manual Process: 18 Steps
Here's every step you just went through to fix this one bug:
- Notice the error alert or see it in your monitoring tool
- Open the error dashboard and read the stack trace
- Identify the file and line number from the stack trace
- Open your IDE and navigate to the file
- Read the surrounding code to understand context
- Reproduce the error locally
- Identify the root cause
- Write the fix
- Run the test suite locally
- Fix any failing tests
- Write new tests covering the edge case
- Run the full test suite again
- Create a new git branch
- Commit and push your changes
- Open a pull request
- Wait for code review
- Merge and deploy to production
- Monitor production to confirm the error is resolved
Total time: 30-60 minutes. For one bug.
Or Let bugstack Fix It in Under 2 Minutes
Every step above? bugstack does it automatically.
Step 1: Install the SDK
npm install bugstack-sdk
Step 2: Initialize
const { initBugStack } = require('bugstack-sdk')
initBugStack({ apiKey: process.env.BUGSTACK_API_KEY })
Step 3: There is no step 3.
bugstack handles everything from here:
- Captures the stack trace and request context
- Pulls the relevant source files from your GitHub repo
- Analyzes the error and understands the code context
- Generates a minimal, verified fix
- Runs your existing test suite
- Pushes through your CI/CD pipeline
- Deploys to production (or opens a PR for review)
Time from error to fix deployed: Under 2 minutes.
Human involvement: zero.
Try bugstack Free → No credit card. 5-minute setup. Cancel anytime.
Deploying the Fix (Manual Path)
- Run the test suite locally to confirm rate limiting works as expected.
- Open a pull request with the rate limiter configuration changes.
- Wait for CI checks to pass on the PR.
- Have a teammate review and approve the PR.
- Merge to main and monitor 429 response rates in staging before promoting to production.
Frequently Asked Questions
How does bugstack verify a rate-limiting fix is safe?
bugstack tests the rate limiter with synthetic traffic, verifies correct 429 responses and Retry-After headers, and confirms legitimate requests still pass before marking it safe.

Does bugstack deploy fixes without review?
Every fix is delivered as a pull request with full CI validation. Your team reviews and approves before anything reaches production.

Do I need a shared store if I run multiple server instances?
Yes. The default in-memory store does not share state across multiple server instances. Use rate-limit-redis or a similar adapter so all instances enforce a consistent limit per client.

How do I exempt certain endpoints from rate limiting?
Apply the rate limiter to a specific path prefix like app.use('/api/', limiter) or use the skip option to bypass limiting for health checks and other internal endpoints.
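A minimal sketch of the skip option (the `/healthz` path is just an illustrative choice; match whatever your probes actually hit):

```javascript
const rateLimit = require('express-rate-limit');

const limiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 100,
  // Bypass rate limiting entirely for health-check requests
  skip: (req) => req.path === '/healthz'
});

// Scope the limiter to API routes only
app.use('/api/', limiter);
```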