
How to create streams from a string in Node.js?


As @substack corrected me in #node, the new streams API in Node v0.10 makes this easier:

const Readable = require('stream').Readable;
const s = new Readable();
s._read = () => {}; // redundant? see update below
s.push('your text here');
s.push(null);

… after which you can freely pipe it or otherwise pass it to your intended consumer.
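For example, a minimal sketch that pipes the stream built above to stdout (any writable stream would work as the destination):

s.pipe(process.stdout); // prints 'your text here', then the stream ends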

It's not as clean as the resumer one-liner, but it does avoid the extra dependency.

(Update: in v0.10.26 through v9.2.1 so far, a call to push directly from the REPL prompt will crash with a not implemented exception if you didn't set _read. It won't crash inside a function or a script. If inconsistency makes you nervous, include the noop.)


Do not use Jo Liss's resumer answer. It will work in most cases, but in my case it cost me a good 4 or 5 hours of bug hunting. There is no need for third-party modules to do this.

NEW ANSWER:

var Readable = require('stream').Readable

var s = new Readable()
s.push('beep')    // the string you want
s.push(null)      // indicates end-of-file basically - the end of the stream

This should be a fully compliant Readable stream. See the Node.js stream documentation for more info on how to use streams properly.
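For instance, a minimal sketch of consuming the s stream built above with the 'data' and 'end' events (chunks arrive as Buffers because no encoding was set):

var chunks = []
s.on('data', function (chunk) {
  chunks.push(chunk)                              // collect each Buffer chunk
})
s.on('end', function () {
  console.log(Buffer.concat(chunks).toString())   // => 'beep'
})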

OLD ANSWER: Just use the native PassThrough stream:

var stream = require("stream")
var a = new stream.PassThrough()
a.write("your string")
a.end()

a.pipe(process.stdout) // piping will work as normal
/*stream.on('data', function(x) {
   // using the 'data' event works too
   console.log('data '+x)
})*/
/*setTimeout(function() {
   // you can even pipe after the scheduler has had time to do other things
   a.pipe(process.stdout) 
},100)*/

a.on('end', function() {
    console.log('ended') // the end event will be called properly
})

Note that the 'close' event is not emitted (which is not required by the stream interfaces).


Since Node 10.17, stream.Readable has a from method to easily create streams from any iterable (which includes array literals):

const { Readable } = require("stream")

const readable = Readable.from(["input string"])

readable.on("data", (chunk) => {
  console.log(chunk) // will be called once with `"input string"`
})

Note that, at least between 10.17 and 12.3, a string is itself an iterable, so Readable.from("input string") will work, but emit one event per character. Readable.from(["input string"]) will emit one event per item in the array (in this case, one item).

Also note that in later Node versions (probably from 12.3, since the documentation says the function was changed then), it is no longer necessary to wrap the string in an array.
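To illustrate the difference (a small sketch reusing the Readable imported above; exact chunking depends on the Node version, as noted):

// One chunk per array item:
Readable.from(["input string"]).on("data", (chunk) => console.log(chunk))
// => logs "input string" once

// On versions where a bare string is iterated character by character:
Readable.from("input string").on("data", (chunk) => console.log(chunk))
// => logs one character per event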

https://nodejs.org/api/stream.html#stream_stream_readable_from_iterable_options


Just create a new instance of the stream module and customize it according to your needs:

var Stream = require('stream');
var stream = new Stream();

stream.pipe = function(dest) {
  dest.write('your string');
  return dest;
};

stream.pipe(process.stdout); // in this case the terminal, change to ya-csv

or

var Stream = require('stream');
var stream = new Stream();

stream.on('data', function(data) {
  process.stdout.write(data); // change process.stdout to ya-csv
});

stream.emit('data', 'this is my string');

Edit: Garth's answer is probably better.

My old answer text is preserved below.


To convert a string to a stream, you can use a paused through stream:

through().pause().queue('your string').end()

Example:

var through = require('through')

// Create a paused stream and buffer some data into it:
var stream = through().pause().queue('your string').end()

// Pass stream around:
callback(null, stream)

// Now that a consumer has attached, remember to resume the stream:
stream.resume()

There's a module for that: https://www.npmjs.com/package/string-to-stream

var str = require('string-to-stream')
str('hi there').pipe(process.stdout) // => 'hi there'