Fix Error: ERR_STREAM_PREMATURE_CLOSE in Node.js
This error occurs when a readable or writable stream is closed or destroyed before it has finished processing all of its data. Common causes include client disconnections during file downloads, premature stream.destroy() calls, and piping errors. Fix it by handling the 'error' event on every stream and using pipeline() for safe piping.
Reading the Stack Trace
Here's what each line means:
- at Writable.onclose (node:internal/streams/end-of-stream:142:30): The writable stream closed before the readable stream finished sending data, causing the premature close error.
- at emitCloseNT (node:internal/streams/destroy:132:10): Node's stream destruction logic emits the close event, triggering cleanup and error propagation.
- at downloadFile (src/handlers/fileHandler.js:31:12): Your file handler at line 31 uses pipe() without error handling, leaving the pipeline unprotected.
Common Causes
1. Client disconnects during download
The HTTP response stream (writable) is closed when the client disconnects, but the file read stream keeps trying to pipe data.
function downloadFile(req, res) {
  const stream = fs.createReadStream('/data/largefile.zip');
  stream.pipe(res); // No error handling if client disconnects
}
2. Using pipe() without error handling
stream.pipe() does not forward errors or handle premature closes between the readable and writable streams.
readStream.pipe(transformStream).pipe(writeStream);
// No error listener on any stream
3. Calling destroy() before stream finishes
Manually destroying a stream before it has completed emitting or consuming all data triggers the premature close.
stream.destroy(); // Called in a timeout or condition before data is fully processed
The Fix
Replace stream.pipe() with pipeline(), which properly handles errors and cleanup across every stream in the chain. When the client disconnects, pipeline() catches the premature close and cleans up the read stream.
Before:

const fs = require('fs');

function downloadFile(req, res) {
  const stream = fs.createReadStream('/data/largefile.zip');
  stream.pipe(res);
}
After:

const fs = require('fs');
const { pipeline } = require('stream');

function downloadFile(req, res) {
  const stream = fs.createReadStream('/data/largefile.zip');
  res.setHeader('Content-Type', 'application/octet-stream');
  pipeline(stream, res, (err) => {
    if (err) {
      if (err.code === 'ERR_STREAM_PREMATURE_CLOSE') {
        console.warn('Client disconnected during download');
      } else {
        console.error('Stream pipeline error:', err);
      }
      // No manual cleanup needed: pipeline() has already
      // destroyed every stream in the chain.
    }
  });
}
Testing the Fix
const { Readable, Writable, pipeline } = require('stream');

describe('Stream pipeline', () => {
  it('handles premature close gracefully', (done) => {
    const readable = new Readable({ read() { this.push('data'); } });
    const writable = new Writable({
      write(chunk, enc, cb) {
        writable.destroy(); // Simulate client disconnect
        cb();
      },
    });

    pipeline(readable, writable, (err) => {
      expect(err).toBeDefined();
      expect(err.code).toBe('ERR_STREAM_PREMATURE_CLOSE');
      done();
    });
  });

  it('completes successfully when no errors occur', (done) => {
    const readable = Readable.from(['hello', 'world']);
    const chunks = [];
    const writable = new Writable({
      write(chunk, enc, cb) { chunks.push(chunk.toString()); cb(); },
    });

    pipeline(readable, writable, (err) => {
      expect(err).toBeUndefined();
      expect(chunks).toEqual(['hello', 'world']);
      done();
    });
  });
});
Run your tests:
npm test
Pushing Through CI/CD
git checkout -b fix/nodejs-stream-premature-close
git add src/handlers/fileHandler.js src/handlers/__tests__/fileHandler.test.js
git commit -m "fix: use pipeline() for stream piping to handle premature close"
git push origin fix/nodejs-stream-premature-close
Your CI config should look something like this:
name: CI

on:
  pull_request:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'npm'
      - run: npm ci
      - run: npm test -- --coverage
      - run: npm run lint
The Full Manual Process: 18 Steps
Here's every step you just went through to fix this one bug:
- Notice the error alert or see it in your monitoring tool
- Open the error dashboard and read the stack trace
- Identify the file and line number from the stack trace
- Open your IDE and navigate to the file
- Read the surrounding code to understand context
- Reproduce the error locally
- Identify the root cause
- Write the fix
- Run the test suite locally
- Fix any failing tests
- Write new tests covering the edge case
- Run the full test suite again
- Create a new git branch
- Commit and push your changes
- Open a pull request
- Wait for code review
- Merge and deploy to production
- Monitor production to confirm the error is resolved
Total time: 30-60 minutes. For one bug.
Or Let bugstack Fix It in Under 2 Minutes
Every step above? bugstack does it automatically.
Step 1: Install the SDK
npm install bugstack-sdk
Step 2: Initialize
const { initBugStack } = require('bugstack-sdk')
initBugStack({ apiKey: process.env.BUGSTACK_API_KEY })
Step 3: There is no step 3.
bugstack handles everything from here:
- Captures the stack trace and request context
- Pulls the relevant source files from your GitHub repo
- Analyzes the error and understands the code context
- Generates a minimal, verified fix
- Runs your existing test suite
- Pushes through your CI/CD pipeline
- Deploys to production (or opens a PR for review)
Time from error to fix deployed: Under 2 minutes.
Human involvement: zero.
Try bugstack Free → No credit card. 5-minute setup. Cancel anytime.
Deploying the Fix (Manual Path)
- Replace all .pipe() calls with pipeline() across the codebase.
- Add error handlers for each stream pipeline.
- Run tests to verify graceful handling of premature closes.
- Open a pull request and wait for CI.
- Merge and monitor error logs for stream-related issues.
Frequently Asked Questions
How does BugStack verify that a fix is safe to deploy?
BugStack runs the fix through your existing test suite, generates additional edge-case tests, and validates that no other modules are affected before marking it safe to deploy.
Does BugStack push fixes straight to production?
BugStack never pushes directly to production. Every fix goes through a pull request with full CI checks, so your team can review it before merging.
What's the difference between pipe() and pipeline()?
pipe() returns the destination stream but does not forward errors or clean up on failure. pipeline() manages the entire chain, destroys all streams on error, and calls a callback when done.
Can I use pipeline() with async/await?
Yes. require('stream/promises').pipeline returns a promise and works well with async/await. It provides the same safety as the callback version with cleaner syntax.