Streams are one of the most powerful features in Node.js. They allow you to handle large amounts of data efficiently by processing it in small pieces (chunks) instead of loading it all into memory at once. This is crucial for performance and scalability.
Types of Streams
- Readable Streams: For reading data from a source (e.g., `fs.createReadStream`).
- Writable Streams: For writing data to a destination (e.g., `fs.createWriteStream`).
- Duplex Streams: Both readable and writable (e.g., a TCP socket).
- Transform Streams: A type of duplex stream that can modify or transform the data as it passes through (e.g., a compression stream); a minimal custom transform is sketched just below.
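To make the Transform type concrete, here is a minimal sketch of a custom transform stream built on Node's built-in `stream` module. It upper-cases text as it flows through; the variable names and the stdin/stdout wiring are illustrative choices, not part of any specific API beyond `stream.Transform`.

```javascript
const { Transform } = require('stream');

// A minimal Transform stream: upper-cases each chunk as it passes through.
// (Illustrative sketch; any chunk-by-chunk modification works the same way.)
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // chunk arrives as a Buffer by default; convert, transform, pass it on
    callback(null, chunk.toString().toUpperCase());
  }
});

// Usage: pipe stdin through the transform to stdout
process.stdin.pipe(upperCase).pipe(process.stdout);
```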
The Power of `pipe()`
The `.pipe()` method is the easiest way to work with streams. It takes a readable stream and connects it to a writable stream, automatically managing the flow of data and backpressure (pausing the readable stream when the writable one can't keep up). One caveat: `.pipe()` does not forward errors. Each stream in the chain needs its own `'error'` listener, or you can use `stream.pipeline()`, which propagates errors and cleans up every stream in the chain for you.
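Here is a brief sketch of `stream.pipeline()` wiring a read-to-write chain with centralized error handling; the file names are placeholders.

```javascript
const fs = require('fs');
const { pipeline } = require('stream');

// pipeline() connects the streams and surfaces any error in one callback,
// destroying all streams in the chain on failure.
pipeline(
  fs.createReadStream('input.txt'),   // placeholder source file
  fs.createWriteStream('output.txt'), // placeholder destination
  (err) => {
    if (err) {
      console.error('Pipeline failed:', err);
    } else {
      console.log('Pipeline succeeded.');
    }
  }
);
```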
Example: Piping a Large File
This example reads a large video file and sends it to a client over HTTP without ever loading the entire file into memory.
```javascript
const fs = require('fs');
const http = require('http');

const server = http.createServer((req, res) => {
  // Create a readable stream from the source file
  const readableStream = fs.createReadStream('large-video.mp4');

  // .pipe() does not forward errors, so handle them on the readable stream
  readableStream.on('error', (err) => {
    res.statusCode = 500;
    res.end('Error reading file');
  });

  // The response object (res) is a writable stream;
  // pipe the file data directly to the client
  res.setHeader('Content-Type', 'video/mp4');
  readableStream.pipe(res);
});

server.listen(3000, () => {
  console.log('Server listening on port 3000');
});
```
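With the server running (and assuming a `large-video.mp4` file sits next to the script), you can check the behavior with a quick request such as `curl -o out.mp4 http://localhost:3000`. Memory usage stays flat regardless of the file's size, because only a small number of chunks are in flight at any moment.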
Example: Compressing a File with a Transform Stream
```javascript
const fs = require('fs');
const zlib = require('zlib'); // Built-in compression module

const readableStream = fs.createReadStream('large-file.txt');
const writableStream = fs.createWriteStream('large-file.txt.gz');

// Create a transform stream for gzip compression
const gzip = zlib.createGzip();

// Pipe the chain: Read -> Compress -> Write
readableStream.pipe(gzip).pipe(writableStream);

// Piping is asynchronous, so wait for the writable stream to finish
writableStream.on('finish', () => {
  console.log('File compressed!');
});
```
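As a variation, the same chain can be written with the promise-based `pipeline` from `stream/promises` (available since Node 15), which awaits completion and rejects on any error. The file names are the same placeholders as above.

```javascript
const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream/promises');

async function compressFile() {
  // pipeline() resolves once the whole chain has finished writing
  await pipeline(
    fs.createReadStream('large-file.txt'),
    zlib.createGzip(),
    fs.createWriteStream('large-file.txt.gz')
  );
  console.log('File compressed!');
}

compressFile().catch((err) => console.error('Compression failed:', err));
```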