
How to use streams to JSON stringify large nested objects in Node.js?

I have a large JavaScript object that I want to convert to JSON and write to a file. I thought I could do this using streams like so:

  var fs = require('fs');
  var JSONStream = require('JSONStream');
  var st = JSONStream.stringifyObject()
             .pipe(fs.createWriteStream('./output_file.js'))

  st.write(large_object);

When I try this I get an error:

stream.js:94
  throw er; // Unhandled stream error in pipe.
        ^
TypeError: Invalid non-string/buffer chunk
    at validChunk (_stream_writable.js:153:14)
    at WriteStream.Writable.write (_stream_writable.js:182:12)

So apparently I can't just write an object to this stringifyObject. I'm not sure what the next step is. Do I need to convert the object to a buffer? Run the object through some conversion stream and pipe it to stringifyObject?

kevzettler asked Sep 06 '15


1 Answer

JSONStream doesn't work that way, but since your large object is already loaded into memory, there is no point in streaming it anyway.

var fs = require('fs-extra');
var file = '/tmp/this/path/does/not/exist/file.txt';

fs.outputJson(file, {name: 'JP'}, function (err) {
  console.log(err); // => null
});

That will write the JSON.

If you want to use JSONStream you could do something like this:

var fs = require('fs');
var jsonStream = require('JSONStream');

var fl = fs.createWriteStream('dat.json');

var out = jsonStream.stringifyObject();
out.pipe(fl);

// stringifyObject expects [key, value] pairs written one at a time.
var obj = { test: 10, ok: true };
for (var key in obj) out.write([key, obj[key]]);
out.end();

Jason Livesay answered Sep 22 '22