
Multiple consumption of a single stream

I want to know if it's possible for multiple functions to consume a single stream in Node.js. If yes, how can this be done? Is it possible to pipe to multiple destinations?

I want to use the stream in two different functions that run in parallel. I am doing the parallel flow using the async module. So would it be possible to, say, issue the pipe() statement inside each of these functions?

Thanks in advance.

asked Jul 30 '13 by Saransh Mohapatra

1 Answer

Yes, it's possible, easy and common. The following pipes data from a single readable stream to multiple writable destinations. A single anonymous callback is registered for the 'data' event; inside it, each writable stream does the actual writing:

var fs  = require('fs');

var rs1 = fs.createReadStream('input1.txt');
var ws1 = fs.createWriteStream('output1.txt');
var ws2 = fs.createWriteStream('output2.txt');

// every 'data' chunk is handed to both writable streams
rs1.on('data', function (data) {
  console.log(data.toString('utf8'));
  ws1.write('1: ' + data);
  ws2.write('2: ' + data);
});

An easier way is to use the .pipe() method:

var fs  = require('fs');

var rs1 = fs.createReadStream('input1.txt');
var ws1 = fs.createWriteStream('output1.txt');
var ws2 = fs.createWriteStream('output2.txt');

rs1.pipe(ws1);
rs1.pipe(ws2);

The .pipe() method also lets you do nifty things like pipeline chaining later on, for pipeline manipulation. It is very similar to the Unix concept of something like du . | sort -rn | less, where you can send data through multiple pipes to successive handlers.

answered Sep 19 '22 by EhevuTov