What are Streams in Node.js?
Streams in Node.js are objects, provided by the built-in stream module, for handling data as a continuous flow of chunks. Instead of reading or writing a large file all at once, streams allow data to be processed in smaller pieces, improving efficiency and performance, especially with large files or network operations.
Why Use Streams?
Streams are memory-efficient and faster because they work with data as it becomes available. This approach avoids loading entire files into memory and is ideal for real-time applications like video streaming, file uploads, or reading logs.
Types of Streams in Node.js
- Readable Streams: Allow data to be read in chunks. Example: fs.createReadStream().
- Writable Streams: Allow writing data in parts. Example: fs.createWriteStream().
- Duplex Streams: Both readable and writable. Example: TCP sockets.
- Transform Streams: Modify data during the read/write process (see the sketch after this list). Example: compression streams.
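As a quick illustration of a Transform stream, here is a minimal sketch that upper-cases each chunk passing through it; the stream name upperCase and the sample text are only placeholders.
const { Transform } = require('stream');
// Minimal Transform stream: upper-cases every chunk passed through it
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // Push the modified chunk downstream
    callback(null, chunk.toString().toUpperCase());
  }
});
// A Transform stream is writable on one side and readable on the other
upperCase.on('data', (chunk) => {
  console.log('Transformed chunk:', chunk.toString());
});
upperCase.write('hello ');
upperCase.end('streams');
Compression utilities such as zlib.createGzip() return ready-made Transform streams that work in the same way.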
Basic Example: Reading a File Using Streams
const fs = require('fs');
// Create a readable stream
const readStream = fs.createReadStream('example.txt', 'utf8');
// Read data in chunks
readStream.on('data', (chunk) => {
console.log('Received chunk:', chunk);
});
readStream.on('end', () => {
console.log('Finished reading file.');
});
In this example, Node.js reads the file piece by piece and processes each chunk, rather than loading the full file at once.
Writing Data Using Writable Streams
const fs = require('fs');
// Create a writable stream
const writeStream = fs.createWriteStream('output.txt');
// Write data in parts
writeStream.write('Hello, ');
writeStream.write('this is written using a stream.\n');
writeStream.end('End of file.');
Writable streams allow you to add data gradually, which is useful when you're receiving data in parts (like from an API or a user form).
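As a rough sketch of that idea, the example below writes chunks that arrive in parts (faked here with a simple array) and pauses for the stream's 'drain' event whenever write() reports a full internal buffer; the file name parts.txt and the chunks array are made up for illustration.
const fs = require('fs');
const writeStream = fs.createWriteStream('parts.txt');
const chunks = ['first part\n', 'second part\n', 'third part\n'];
function writeNext(index) {
  if (index >= chunks.length) {
    writeStream.end(); // no more data to write
    return;
  }
  // write() returns false when the internal buffer is full
  const ok = writeStream.write(chunks[index]);
  if (ok) {
    writeNext(index + 1);
  } else {
    // Wait until the buffer drains before writing more
    writeStream.once('drain', () => writeNext(index + 1));
  }
}
writeStream.on('finish', () => console.log('All parts written.'));
writeNext(0);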
Piping Streams Together
One of the best features of streams is the pipe() method, which allows one stream to feed directly into another — for example, reading from a file and writing its contents to another.
const fs = require('fs');
const readStream = fs.createReadStream('input.txt');
const writeStream = fs.createWriteStream('copy.txt');
// Pipe the readable stream into the writable stream
readStream.pipe(writeStream);
This example reads data from input.txt and writes it directly to copy.txt without needing manual handling of data chunks.
Handling Stream Events
Streams emit several events like data, end, error, and close. Listening to these events ensures smooth data handling and proper error management.
readStream.on('error', (err) => {
console.error('An error occurred:', err.message);
});
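Note that pipe() by itself does not forward errors from the source to the destination, so each stream needs its own error listener. One common alternative, sketched below with placeholder file names, is the pipeline() helper from the built-in stream module, which connects the streams and reports any failure through a single callback.
const fs = require('fs');
const { pipeline } = require('stream');
// Connect the streams and handle any error in one place
pipeline(
  fs.createReadStream('input.txt'),
  fs.createWriteStream('copy.txt'),
  (err) => {
    if (err) {
      console.error('Pipeline failed:', err.message);
    } else {
      console.log('Pipeline succeeded.');
    }
  }
);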
When to Use Streams
Streams are ideal when working with:
- Large files (read/write without memory overload)
- Real-time data processing (e.g., live video or audio)
- Network operations like downloading/uploading files
- API integrations that return partial data over time
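For example, serving a large file over HTTP combines several of these cases. A minimal sketch, assuming a file named large-file.txt exists in the working directory:
const http = require('http');
const fs = require('fs');
// Stream the file to each client instead of loading it into memory first
const server = http.createServer((req, res) => {
  const fileStream = fs.createReadStream('large-file.txt');
  fileStream.on('error', (err) => {
    res.statusCode = 500;
    res.end('Could not read the file.');
    console.error(err.message);
  });
  fileStream.pipe(res); // res is a writable stream
});
server.listen(3000, () => console.log('Server listening on port 3000'));
Because the HTTP response object is itself a writable stream, pipe() sends the file to the client chunk by chunk, keeping memory usage flat no matter how large the file is.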
Conclusion
Streams in Node.js offer a powerful and efficient way to handle data. By working with chunks, streams reduce memory usage and improve performance — especially in applications dealing with large files or real-time data. Mastering streams is a key skill for any serious Node.js developer.