How can I transform a Node.js buffer into a Readable stream using the streams2 interface?
I already found this answer and the stream-buffers module, but that module is based on the streams1 interface.
To consume a readable stream, we can use the pipe/unpipe methods, or the read/unshift/resume methods. To consume a writable stream, we can make it the destination of pipe/unpipe, or just write to it with the write method and call the end method when we're done.
So what is the difference between a stream and a buffer? A buffer has a specified, definite length, whereas a stream does not. A stream is a sequence of bytes that is read and/or written, while a buffer is a sequence of bytes that is stored.
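For example, here is a minimal sketch of both sides, assuming hypothetical file names input.txt and output.txt:

var fs = require('fs');

// Readable side: pipe a file stream into stdout
var readable = fs.createReadStream('input.txt'); // 'input.txt' is a hypothetical file
readable.pipe(process.stdout);

// Writable side: write chunks manually, then signal the end
var writable = fs.createWriteStream('output.txt'); // 'output.txt' is a hypothetical target
writable.write('first chunk\n');
writable.write('second chunk\n');
writable.end('last chunk\n'); // end() can take a final chunk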
Streams work on a concept called a buffer. A buffer is temporary memory that a stream uses to hold data until it is consumed. The buffer size is determined by the highWaterMark option on the stream instance, a number denoting the size of the buffer in bytes.
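As an illustration (a sketch only, the 16-byte value is arbitrary), highWaterMark can be passed when constructing a stream, and write() signals back-pressure once that limit is exceeded:

var stream = require('stream');

// Create a PassThrough with a small internal buffer of 16 bytes
var small = new stream.PassThrough({ highWaterMark: 16 });

// write() returns false once the internal buffer exceeds highWaterMark,
// which is the signal to wait for the 'drain' event before writing more
var ok = small.write(Buffer.alloc(32));
console.log(ok); // false, because 32 bytes exceed the 16-byte highWaterMark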
The easiest way is probably to create a new PassThrough stream instance, and simply push your data into it. When you pipe it to other streams, the data will be pulled out of the first stream.
var stream = require('stream');

// Initiate the source
var bufferStream = new stream.PassThrough();

// Write your buffer
bufferStream.end(Buffer.from('Test data.'));

// Pipe it to something else (i.e. stdout)
bufferStream.pipe(process.stdout);
As natevw suggested, it's even more idiomatic to use a stream.PassThrough and end it with the buffer:

var stream = require('stream');

var buffer = Buffer.from('foo'); // new Buffer() is deprecated
var bufferStream = new stream.PassThrough();
bufferStream.end(buffer);
bufferStream.pipe(process.stdout);
This is also how buffers are converted/piped in vinyl-fs.
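Since the question is specifically about the streams2 interface, here is a sketch of consuming the resulting stream with the 'readable' event and read() instead of pipe():

var stream = require('stream');

var bufferStream = new stream.PassThrough();
bufferStream.end(Buffer.from('Test data.'));

// streams2 style: wait for 'readable', then pull chunks with read()
bufferStream.on('readable', function () {
  var chunk;
  while ((chunk = bufferStream.read()) !== null) {
    console.log('chunk:', chunk.toString());
  }
});

bufferStream.on('end', function () {
  console.log('done');
});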