I'm using fast-csv's fromPath()
method to read data from a file. I would like to write this data into an array (which I will subsequently sort). I would expect the code below to work for this purpose, but it does not:
var csv = require('fast-csv');
var dataArr = [];

csv.fromPath("datas.csv", {headers: true})
    .on("data", data => {
        console.log(data);
        // > { num: '4319', year: '1997', month: '4', day: '20', ...
        dataArr.push(data);
    });

console.log(dataArr);
// > []
I am able to read the data in the file with this code, but the array is not populated.
What is a good way to accomplish this, and why does the code above not work?
I know this question was asked a long time ago, but I only recently got to work with CSV files while creating an API with Node.js. Being a typical programmer, I googled something like "reading from a file with fast-csv and writing into an array", but to date there isn't a proper answer to this question, so I decided to write one.
The "data" handler is asynchronous: registering it does not pause the main flow. The code after it runs immediately, before any rows have been read, and the array is only complete once the "end" event fires.
var queryParameter = () => new Promise(resolve => {
    let returnLit = [];
    csv.fromPath("<fileName>", {headers: true})
        .on('data', data => {
            returnLit.push(data[<header name>].trim());
        })
        .on('end', () => {
            resolve(returnLit);
        });
});
var mainList = [];
queryParameter().then(res => mainList = res);
If you want to validate something, pass an argument into queryParameter() and use that argument in your validate method.
The "data" callback is asynchronous, so the statements that follow it run before the callback has even fired. This is why the code does not work, as others have pointed out in answers and comments.
As for a good way to accomplish the task, the "end" callback is a natural fit: the intention here is to do something with the whole data set after the file has been read completely.
var dataArr = [];

csv.fromPath("datas.csv", {headers: true})
    .on("data", data => {
        dataArr.push(data);
    })
    .on("end", () => {
        console.log(dataArr.length);
        // > 4187
    });
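Since the question says the rows will subsequently be sorted, the sort belongs inside the "end" handler as well. A sketch with in-memory rows (the field name num is assumed from the sample row in the question; with {headers: true} every value arrives as a string, so compare numerically rather than lexically):

```javascript
// Rows shaped like the question's sample output; values are strings.
const dataArr = [
  { num: "4319", year: "1997" },
  { num: "87",   year: "2001" },
  { num: "1502", year: "1999" },
];

// A lexical sort would put "1502" before "87"; convert to numbers instead.
dataArr.sort((a, b) => Number(a.num) - Number(b.num));

console.log(dataArr.map((row) => row.num)); // [ '87', '1502', '4319' ]
```

In the real code this comparator would run inside the "end" callback, once dataArr holds every row.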