What are streams in Node.js?
Node.js Streams
Streams are one of Node.js's most powerful features. They allow you to process data piece by piece (chunk by chunk) rather than loading everything into memory at once, which is essential for handling large files, network data, or real-time processing.
Why Streams?
```js
const fs = require('fs');

// WITHOUT streams: loads the entire file into memory
const data = fs.readFileSync('huge-file.mp4'); // 4 GB in RAM!
res.end(data);

// WITH streams: processes chunk by chunk
fs.createReadStream('huge-file.mp4').pipe(res);
```
Four Types of Streams
| Type | Description | Example |
|---|---|---|
| Readable | Source of data | fs.createReadStream(), http.IncomingMessage |
| Writable | Destination for data | fs.createWriteStream(), http.ServerResponse |
| Duplex | Both readable and writable | net.Socket, tls.TLSSocket |
| Transform | Duplex that modifies data | zlib.createGzip(), crypto.createCipheriv() |
Readable Stream
```js
const fs = require('fs');
const readable = fs.createReadStream('./large-file.txt', {
encoding: 'utf8',
highWaterMark: 64 * 1024 // 64KB chunks
});
readable.on('data', chunk => {
console.log(`Received ${chunk.length} bytes`);
});
readable.on('end', () => console.log('Done reading'));
readable.on('error', err => console.error(err));
```
Writable Stream
```js
const fs = require('fs');
const writable = fs.createWriteStream('./output.txt');
writable.write('Hello, ');
writable.write('Streams!');
writable.end('\n');
writable.on('finish', () => console.log('Done writing'));
```
pipe(): Connecting Streams
pipe() connects a readable to a writable and automatically handles backpressure:
```js
const fs = require('fs');
const zlib = require('zlib');
// Read β Compress β Write (all streaming, low memory)
fs.createReadStream('input.txt')
.pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('input.txt.gz'));
```
pipeline(): The Modern Approach (Node.js 15+)
```js
const { pipeline } = require('stream/promises');
const fs = require('fs');
const zlib = require('zlib');
// await needs an async context (an async function, or ES module top level)
await pipeline(
fs.createReadStream('input.txt'),
zlib.createGzip(),
fs.createWriteStream('input.txt.gz')
);
// Automatically handles errors and cleanup
```
Transform Stream (Custom)
```js
const { Transform } = require('stream');
const upperCaseTransform = new Transform({
transform(chunk, encoding, callback) {
this.push(chunk.toString().toUpperCase());
callback();
}
});
process.stdin.pipe(upperCaseTransform).pipe(process.stdout);
```
Backpressure
When the consumer is slower than the producer, Node.js automatically pauses reading:
```js
const fs = require('fs');
const readable = fs.createReadStream('./huge.bin');
const writable = fs.createWriteStream('./copy.bin');
readable.on('data', chunk => {
const canContinue = writable.write(chunk);
if (!canContinue) {
readable.pause(); // stop reading until writable drains
writable.once('drain', () => readable.resume());
}
});
```
pipe() handles backpressure automatically.
Summary
Use streams when:
- Processing large files (video, logs, CSVs)
- Handling HTTP request/response bodies
- Building real-time data pipelines
- Reducing memory footprint of your application