 

Best way to pipe data from one Input Stream to multiple Output Streams

A PipedInputStream/PipedOutputStream connection works well when the data only needs to be piped to one output, but if multiple output streams are connected to one input stream, the data becomes fragmented across the different outputs. My current solution is a threaded "reader" that reads data from an InputStream and writes it to each OutputStream that has been registered with the reader. This works, but it feels messy and inefficient compared to the native piped I/O classes.
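The threaded "reader" described above can be sketched as follows; the class and method names here are illustrative, not from the question's actual code:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.List;

/** Reads from a single InputStream and copies every chunk it reads
 *  to each registered OutputStream (a simple "tee" reader). */
class TeeReader implements Runnable {
    private final InputStream source;
    private final List<OutputStream> sinks;

    TeeReader(InputStream source, List<OutputStream> sinks) {
        this.source = source;
        this.sinks = sinks;
    }

    @Override
    public void run() {
        byte[] buf = new byte[8192];
        try {
            int n;
            while ((n = source.read(buf)) != -1) {
                // Every sink receives its own copy of each chunk.
                for (OutputStream out : sinks) {
                    out.write(buf, 0, n);
                }
            }
            for (OutputStream out : sinks) {
                out.flush();
                out.close();
            }
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}
```

Run on its own thread (`new Thread(new TeeReader(in, outs)).start()`), this keeps each output stream's data intact, at the cost of one extra copy per sink.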

Is there a better way to handle this or is the implementation that I'm working with about as good as I'm going to get?

Asked Jun 28 '10 by WeeTodd

1 Answer

If one input stream has to be read by multiple consumers, and the input stream is ephemeral (i.e. is not a resource that can be 'rewound', or supports multiple input pointers) you will generally have to provide a buffering scheme that behaves as if it retains each data item until all consumers have read it.

You have several implementation choices. The simplest is what you suggest, with the overhead being primarily storage space for multiple copies of the data in the output buffers. If storage is an issue you could provide a single buffer that maintains separate read pointers, one for each consumer, and keeps in memory only the data between the lowest and highest read pointers. If the consumers read data at greatly different speeds you could still end up with most or all of the input data in memory, at which point some kind of input throttling, or intermediate disk buffering scheme, would become necessary.

I assume the single input stream is not persistent (i.e., not a file on disk); if it were, the solution would be trivial, since each consumer could simply open its own stream on the file.

Answered Oct 16 '22 by Jim Garrison