Streams in Node.js

Streams in Node.js are built to handle a continuous flow of data efficiently. They let you read and write data in chunks, making them ideal for tasks like file handling, network communication, and data transformation.

Why Use Streams?

Rather than loading an entire file into memory, streams let you process it piece by piece. This approach reduces memory usage and improves performance, especially when dealing with large files or real-time data. The sketch below contrasts the two approaches.
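
As a rough illustration, here is the difference between reading a whole file at once and streaming it chunk by chunk (the file name 'large.log' is just a placeholder):

const fs = require('fs');

// Loads the entire file into memory before the callback runs
fs.readFile('large.log', 'utf8', (err, data) => {
    if (err) throw err;
    console.log('Loaded', data.length, 'characters at once');
});

// Streams the same file in small chunks, keeping memory usage low
fs.createReadStream('large.log', 'utf8')
    .on('data', chunk => console.log('Got a chunk of', chunk.length, 'characters'))
    .on('end', () => console.log('Done streaming'));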

Types of Streams in Node.js

Node.js provides four core stream types, each with a specific role:

  • Readable Streams – Used for reading data.
  • Writable Streams – Used for writing data.
  • Duplex Streams – Used for both reading and writing.
  • Transform Streams – A duplex stream that modifies data while it is being read or written (see the sketch after this list).
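
The Transform type is the least intuitive of the four, so here is a minimal sketch of a custom transform stream that upper-cases whatever passes through it. The Transform class and its transform option are standard parts of the stream module; the upper-casing logic itself is just an illustration:

const { Transform } = require('stream');

// A custom transform stream: data written in comes back out upper-cased
const upperCase = new Transform({
    transform(chunk, encoding, callback) {
        // chunk arrives as a Buffer by default; convert it, modify it, and pass it along
        callback(null, chunk.toString().toUpperCase());
    }
});

// Anything typed into stdin is echoed back in upper case
process.stdin.pipe(upperCase).pipe(process.stdout);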

Working with Readable Streams

Readable streams are often used to read data from files, HTTP requests, or other sources. Here’s a basic example:


const fs = require('fs');

// Create a readable stream; passing 'utf8' delivers chunks as strings instead of Buffers
const readStream = fs.createReadStream('input.txt', 'utf8');

// The 'data' event fires each time a chunk is read from the file
readStream.on('data', chunk => {
    console.log('Chunk received:', chunk);
});
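
In real code you will usually also want to know when the stream finishes or fails. The 'end' and 'error' events are standard on readable streams; continuing the example above:

// Fires once all data has been consumed
readStream.on('end', () => {
    console.log('No more data.');
});

// Fires if the file cannot be read, e.g. because it does not exist
readStream.on('error', err => {
    console.error('Read failed:', err.message);
});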
    

Working with Writable Streams

Writable streams are used to send data to a file or output source. Here’s how you can write to a file:


const fs = require('fs');

// Create a writable stream; output.txt is created if it does not already exist
const writeStream = fs.createWriteStream('output.txt');

// Write a chunk, then signal that no more data will follow
writeStream.write('Hello from Node.js Streams!');
writeStream.end();
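
If you need to know when the data has actually been flushed to disk, listen for the 'finish' event, which writable streams emit after end() completes. Continuing the example above:

// 'finish' fires once everything passed to write() has been flushed
writeStream.on('finish', () => {
    console.log('All data written to output.txt');
});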
    

Using pipe() to Connect Streams

The pipe() method lets you connect a readable stream to a writable one. It’s perfect for copying or transferring data from one place to another:


const fs = require('fs');

const readStream = fs.createReadStream('input.txt');
const writeStream = fs.createWriteStream('output.txt');

// Send every chunk from input.txt straight into output.txt; pipe() handles backpressure automatically
readStream.pipe(writeStream);
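
One caveat: pipe() does not forward errors from the source stream to the destination. For robust error handling, recent Node.js versions provide pipeline() in the stream module, which tears everything down and reports a single error if any stream in the chain fails. A minimal sketch using the promise-based variant:

const fs = require('fs');
const { pipeline } = require('stream/promises');

async function copyFile() {
    // pipeline() connects the streams and rejects if either one errors
    await pipeline(
        fs.createReadStream('input.txt'),
        fs.createWriteStream('output.txt')
    );
    console.log('Copy complete');
}

copyFile().catch(err => console.error('Copy failed:', err));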
    

Example: File Compression Using Streams

You can use transform streams to compress files on the fly. Here's an example using the zlib module:


const fs = require('fs');
const zlib = require('zlib');

const readStream = fs.createReadStream('file.txt');
const zipStream = zlib.createGzip(); // a transform stream that gzips whatever flows through it
const writeStream = fs.createWriteStream('file.txt.gz');

// Read file.txt, compress it on the fly, and write the result to file.txt.gz
readStream.pipe(zipStream).pipe(writeStream);
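
Decompression works the same way in reverse with zlib.createGunzip(); the output file name below is just a placeholder:

const fs = require('fs');
const zlib = require('zlib');

// Read the compressed file, gunzip it on the fly, and write the original contents back out
fs.createReadStream('file.txt.gz')
    .pipe(zlib.createGunzip())
    .pipe(fs.createWriteStream('file-restored.txt'));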
    

Common Use Cases

  • Reading large files without loading them fully in memory
  • Streaming video or audio content
  • Sending and receiving data over a network (see the sketch after this list)
  • Transforming data in real-time, like compression or encryption
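
As an example of the network case, an HTTP server can pipe a file directly into the response object, which is itself a writable stream. The port and file name here are arbitrary:

const fs = require('fs');
const http = require('http');

const server = http.createServer((req, res) => {
    // res is a writable stream, so a file can be piped straight into it
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    fs.createReadStream('input.txt').pipe(res);
});

server.listen(3000, () => console.log('Listening on http://localhost:3000'));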

Conclusion

Streams in Node.js are essential for building scalable and efficient applications. They help manage data flow with minimal resource consumption, making them ideal for working with files, APIs, and real-time data processing.


