Node.js Streams
Streams are one of the most powerful and fundamental concepts in Node.js. A stream is a way of handling data by processing it piece by piece (in chunks) rather than loading everything into memory at once. This makes streams extremely efficient for working with large amounts of data, such as reading large files, handling HTTP responses, or processing real-time data.
Imagine reading a book. Instead of memorizing the entire book before being able to discuss it, you read a chapter, discuss it, then move on to the next. Streams work the same way: data is processed as it arrives, chunk by chunk.
Why Use Streams?
Consider reading a 2 GB video file without streams. The entire file would be loaded into memory before processing begins — which could crash the application on machines with limited RAM. With streams, the file is read and processed in small pieces, keeping memory usage low and performance high.
Types of Streams in Node.js
Node.js has four types of streams:
| Stream Type | Description | Example |
|---|---|---|
| Readable | Data can be read from this stream. | Reading a file (fs.createReadStream) |
| Writable | Data can be written to this stream. | Writing a file (fs.createWriteStream) |
| Duplex | Both readable and writable. | TCP network socket |
| Transform | A duplex stream that modifies data as it passes through. | Compression (zlib.createGzip) |
Readable Streams
Creating a Readable Stream from a File
```javascript
const fs = require('fs');

const readStream = fs.createReadStream('largeFile.txt', { encoding: 'utf8' });

readStream.on('data', function (chunk) {
  console.log("Received chunk:");
  console.log(chunk);
});

readStream.on('end', function () {
  console.log("Finished reading the file.");
});

readStream.on('error', function (err) {
  console.error("Error:", err.message);
});
```
The 'data' event fires every time a new chunk of data arrives. The 'end' event fires when the stream has no more data to deliver.
Controlling Chunk Size
```javascript
const readStream = fs.createReadStream('largeFile.txt', {
  encoding: 'utf8',
  highWaterMark: 64 * 1024 // 64 KB per chunk
});
```

highWaterMark sets the size of the internal buffer, and therefore the maximum size of each chunk delivered by 'data' events. For fs.createReadStream the default is 64 KB.
Writable Streams
Creating a Writable Stream to a File
```javascript
const fs = require('fs');

const writeStream = fs.createWriteStream('output.txt');

writeStream.write("First line of data.\n");
writeStream.write("Second line of data.\n");
writeStream.write("Third line of data.\n");

writeStream.end(function () {
  console.log("Writing complete. File saved.");
});

writeStream.on('error', function (err) {
  console.error("Write error:", err.message);
});
```
The write() method sends a chunk of data. The end() method signals that no more data will be written and closes the stream.
Piping Streams – Connecting Readable to Writable
The most elegant feature of streams is piping. The .pipe() method connects a readable stream directly to a writable stream, automatically managing the flow of data between them.
Example – Copy a File Using Streams
```javascript
const fs = require('fs');

const readStream = fs.createReadStream('source.txt');
const writeStream = fs.createWriteStream('destination.txt');

readStream.pipe(writeStream);

writeStream.on('finish', function () {
  console.log("File copied successfully!");
});
```
This reads from source.txt and writes directly to destination.txt without loading the entire file into memory. This is far more memory-efficient than reading the whole file first.
Transform Streams – Modifying Data in Transit
A Transform stream modifies or processes data as it flows through. A common use case is compression or encryption.
Example – Compressing a File with zlib
```javascript
const fs = require('fs');
const zlib = require('zlib');

const readStream = fs.createReadStream('data.txt');
const gzipTransform = zlib.createGzip();
const writeStream = fs.createWriteStream('data.txt.gz');

// Pipe: read → compress → write
readStream.pipe(gzipTransform).pipe(writeStream);

writeStream.on('finish', function () {
  console.log("File compressed successfully!");
});
```
Multiple streams are chained using .pipe(). The compressed data flows from the reader, through the gzip transformer, and into the output file.
Creating a Custom Readable Stream
```javascript
const { Readable } = require('stream');

class CounterStream extends Readable {
  constructor() {
    super();
    this.count = 1;
  }

  _read() {
    if (this.count <= 5) {
      this.push("Number: " + this.count + "\n");
      this.count++;
    } else {
      this.push(null); // Signal end of stream
    }
  }
}

const counter = new CounterStream();
counter.pipe(process.stdout);
```
Output:

```
Number: 1
Number: 2
Number: 3
Number: 4
Number: 5
```
process.stdout is itself a writable stream — so piping to it prints data directly to the terminal.
Creating a Custom Writable Stream
```javascript
const { Writable } = require('stream');

class LogWriter extends Writable {
  _write(chunk, encoding, callback) {
    console.log("[LOG]:", chunk.toString());
    callback(); // Signal that writing is done for this chunk
  }
}

const logger = new LogWriter();
logger.write("Server started.");
logger.write("User connected.");
logger.end("Goodbye!");
```
Output:

```
[LOG]: Server started.
[LOG]: User connected.
[LOG]: Goodbye!
```
Stream Events Summary
| Stream Type | Event | Fires When |
|---|---|---|
| Readable | data | A chunk of data is available. |
| Readable | end | All data has been read. |
| Readable | error | An error occurred during reading. |
| Writable | drain | The internal buffer has emptied after write() returned false; writing can resume. |
| Writable | finish | All data has been flushed to the stream. |
| Writable | error | An error occurred during writing. |
Key Points
- Streams process data in chunks, making them memory-efficient for large files or real-time data.
- The four stream types are: Readable, Writable, Duplex, and Transform.
- fs.createReadStream() and fs.createWriteStream() are the most common built-in stream utilities.
- .pipe() connects a readable stream to a writable stream for efficient data transfer.
- Multiple streams can be chained with .pipe() to build powerful data pipelines.
- Transform streams allow data to be modified on the fly (e.g., compression, encryption).
- Custom streams are created by extending the built-in Readable, Writable, or Transform classes.
- Streams are built on top of EventEmitter: events like data, end, and finish drive their behavior.
