
Parsing a CSV file using NodeJS

Tags:

node.js

csv

It sounds like you need a stream-based library such as fast-csv, which also includes validation support.
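The parse-then-validate idea those libraries package up is, at its core, just a predicate applied to each parsed row. A stdlib-only sketch of that idea (no quoting or escaping handled; `parseCsv` and `isValidRow` are illustrative names, not any library's API):

```javascript
// Stdlib-only sketch of "parse + validate": split into rows/cells, then
// filter rows through a validation predicate. Illustrative names only.
function parseCsv(text, delimiter = ',') {
  return text
    .trim()
    .split('\n')
    .map(line => line.split(delimiter).map(cell => cell.trim()));
}

function isValidRow(row) {
  // Example rule: every cell must be non-empty.
  return row.every(cell => cell.length > 0);
}

const raw = 'a,b,c\n1,,3\nx,y,z';
const rows = parseCsv(raw);
const valid = rows.filter(isValidRow);
console.log(valid); // rows containing an empty cell are dropped
```

A real library does this per-row as the file streams, rather than after loading everything, which is what matters for large files.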


I used this approach:

const fs = require('fs');
// On csv-parse v4 the module itself is the parse function;
// on v5+ use: const { parse } = require('csv-parse');
const parse = require('csv-parse');

const csvData = [];
fs.createReadStream(req.file.path)
    .pipe(parse({ delimiter: ':' }))
    .on('data', function (csvrow) {
        console.log(csvrow);
        // do something with csvrow
        csvData.push(csvrow);
    })
    .on('error', function (err) {
        // handle read/parse errors
        console.error(err.message);
    })
    .on('end', function () {
        // do something with csvData
        console.log(csvData);
    });

My current solution uses the async module to execute in series:

var fs = require('fs');
var parse = require('csv-parse');
var async = require('async');

var inputFile='myfile.csv';

var parser = parse({delimiter: ','}, function (err, data) {
  if (err) {
    return console.error(err.message);
  }
  async.eachSeries(data, function (line, callback) {
    // do something with the line
    doSomething(line).then(function () {
      // when processing finishes, invoke the callback to move to the next one
      callback();
    });
  });
});
fs.createReadStream(inputFile).pipe(parser);
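The same one-row-at-a-time ordering can be written with native async/await instead of the async module, a sketch under the assumption that row processing returns a promise (`processInSeries` and `processRow` are placeholder names standing in for `doSomething` above):

```javascript
// Serial processing with native async/await: each await completes before
// the next row starts, mirroring async.eachSeries. Placeholder names only.
async function processInSeries(rows, processRow) {
  const results = [];
  for (const row of rows) {
    results.push(await processRow(row));
  }
  return results;
}

// Usage sketch:
processInSeries([1, 2, 3], async function (n) { return n * 2; })
  .then(function (doubled) { console.log(doubled); });
```

On modern Node this removes the async-module dependency entirely, and errors propagate as ordinary promise rejections.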

  • This solution uses csv-parser instead of csv-parse used in some of the answers above.
  • csv-parser came around 2 years after csv-parse.
  • Both serve the same purpose, but I have personally found csv-parser better, as it makes handling headers easy.

Install the csv-parser first:

npm install csv-parser

So suppose you have a CSV file like this:

NAME,AGE
Lionel Messi,31
Andres Iniesta,34

You can perform the required operation as:

const fs = require('fs');
const csv = require('csv-parser');

const inputFilePath = 'my-file.csv'; // path to the CSV file above

fs.createReadStream(inputFilePath)
    .pipe(csv())
    .on('data', function (data) {
        try {
            console.log('Name is: ' + data.NAME);
            console.log('Age is: ' + data.AGE);

            // perform the operation
        } catch (err) {
            // error handler
        }
    })
    .on('end', function () {
        // some final operation
    });
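The header handling that makes csv-parser convenient is essentially this mapping: the first line becomes the object keys for every following row, which is why `data.NAME` and `data.AGE` work above. A stdlib-only sketch of that mapping (no quoting or escaping handled; `rowsToObjects` is an illustrative name, not part of csv-parser):

```javascript
// Conceptual sketch of header-to-key mapping: zip the header row with
// each data row to build plain objects. Illustrative name, no escaping.
function rowsToObjects(text) {
  const [headerLine, ...lines] = text.trim().split('\n');
  const headers = headerLine.split(',').map(function (h) { return h.trim(); });
  return lines.map(function (line) {
    const cells = line.split(',').map(function (c) { return c.trim(); });
    return Object.fromEntries(
      headers.map(function (h, i) { return [h, cells[i]]; })
    );
  });
}

const sample = 'NAME,AGE\nLionel Messi,31\nAndres Iniesta,34';
console.log(rowsToObjects(sample));
```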

For further reading, refer to the csv-parser documentation.