 

Synchronous processing of a csv file using fast-csv

Tags: node.js, csv

I am trying to process a CSV file using fast-csv, and here is my code.

    var fs = require('fs');
    var csv = require('fast-csv');

    var stream = fs.createReadStream("sample.csv");
    csv.fromStream(stream, {headers: true})
        .on("data", function(data) {
            console.log('here');
            module.exports.saveData(data, callback);
        })
        .on("end", function() {
            console.log('end of saving file');
        });

    module.exports.saveData = function(data) {
        console.log('inside saving');
    };

The problem I am facing is that the processing is not synchronous. The output I am seeing is something like:

here
here
inside saving
inside saving

But what I want is:

here
inside saving
here
inside saving

I am assuming we need to use async.series or async.eachSeries, but I am not exactly sure how to use them here. Any inputs are greatly appreciated.
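What I had in mind with async.eachSeries is roughly this (untested, and it assumes the whole file fits in memory and that saveData calls back when it is done):

    var async = require('async');
    var fs = require('fs');
    var csv = require('fast-csv');

    var rows = [];
    csv.fromStream(fs.createReadStream("sample.csv"), {headers: true})
        .on("data", function(data) {
            rows.push(data); // buffer every parsed row
        })
        .on("end", function() {
            // save the buffered rows strictly one after another
            async.eachSeries(rows, function(row, done) {
                module.exports.saveData(row, done);
            }, function(err) {
                console.log('end of saving file');
            });
        });

The obvious drawback is that nothing is saved until the whole file has been parsed.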

Thanks in advance!

asked Jul 14 '16 by Gayathri Yanamandra


1 Answer

You can pause the parser, wait for saveData to complete, and subsequently resume the parser:

var parser = csv.fromStream(stream, {headers: true})
  .on("data", function(data) {
    console.log('here');
    parser.pause();                 // stop the stream until saveData has finished
    module.exports.saveData(data, function(err) {
      // TODO: handle error
      parser.resume();              // continue with the next row
    });
  })
  .on("end", function() {
    console.log('end of saving file');
  });

module.exports.saveData = function(data, callback) {
  console.log('inside saving');
  // Simulate an asynchronous operation:
  setImmediate(callback);
};
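
As a side note, newer releases of fast-csv renamed these helpers; on a recent version the same pause/resume pattern would look roughly like this (a sketch assuming fast-csv v3+, where parseStream replaced fromStream, and the same callback-taking saveData as above):

var fs = require('fs');
var csv = require('fast-csv');

var stream = fs.createReadStream("sample.csv");
var parser = csv.parseStream(stream, {headers: true})
  .on("data", function(data) {
    parser.pause();                 // hold the stream while the row is saved
    module.exports.saveData(data, function(err) {
      // TODO: handle error
      parser.resume();              // ask the parser for the next row
    });
  })
  .on("end", function() {
    console.log('end of saving file');
  });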
answered by robertklep