I'm new to working with D3.js and HTML/JS in general, and I'm trying to figure out how to read a csv file into separate arrays using d3.csv. I have a csv file that looks like this:
cause, prevalence, disability;
cancer, .3, .4;
aids, .5, .5;
malaria, .2, .1;
For now, I'm struggling with creating arrays for "cause", "prevalence", and "disability". Here is the code that I am using:
<html>
  <script type="text/javascript" src="../d3/d3.js"></script>
  <script type="text/javascript" src="../d3/d3.csv.js"></script>
  <script type="text/javascript">
    d3.csv('disability.csv', function(csv) {
      var cause = csv[0];
      var prevalence = csv[1];
      var disability = csv[2];
      for (var i = 0; i < cause.length; i++) {
        document.write(cause[i] + "<br />");
      }
    });
  </script>
</html>
The document.write portion is simply there to test that I have read in the data.
It looks like you mean var cause = csv[0] to refer to the first variable/column, I'm guessing? If that's the case, the reason it's not working is that JavaScript arrays don't give you an easy way to pull out a single column. csv[0] instead refers to the first row of your csv. With d3's csv module, each row comes back as an object (dictionary) keyed by the column names from the header row, and every value is read in as a string. For your file, for example:

console.log(csv[0]);
// { cause: "cancer", prevalence: ".3", disability: ".4" }
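Because the parser hands you strings, you'll usually want to coerce numeric columns yourself. A minimal sketch, assuming the same callback style of d3.csv your code already uses and the column names from your file:

d3.csv('disability.csv', function(csv) {
    // Each row is an object keyed by column name; values start out as strings.
    csv.forEach(function(d) {
        d.prevalence = +d.prevalence;   // ".3" -> 0.3
        d.disability = +d.disability;   // ".4" -> 0.4
    });
    console.log(csv[0]);                // { cause: "cancer", prevalence: 0.3, disability: 0.4 }
});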
So if you want to create separate arrays for each variable/column, you could do something like this:
var cause = [],
    prevalence = [],
    disability = [];

csv.forEach(function(d) {   // forEach, since we only need the side effect of pushing
    cause.push(d.cause);
    prevalence.push(d.prevalence);
    disability.push(d.disability);
});
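With your sample file, and assuming the trailing semicolons in the sample above aren't actually in the file, those arrays would end up looking like this (values still strings until you coerce them):

console.log(cause);       // ["cancer", "aids", "malaria"]
console.log(prevalence);  // [".3", ".5", ".2"]
console.log(disability);  // [".4", ".5", ".1"]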
Then you'd be left with the three arrays it looks like you're trying to make. In d3, though, you can usually just pass the entire csv array as the data for whatever you're creating and then use an accessor function, e.g. .attr('y', function(d) { return d.prevalence; }), to pull out a specific column/variable in a particular context.
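For instance, here's a rough sketch of binding the parsed rows directly to elements; the paragraph output is just illustrative, but the select/data/enter/accessor pattern is the standard d3 idiom:

d3.csv('disability.csv', function(csv) {
    d3.select('body').selectAll('p')
        .data(csv)                       // one datum (row object) per element
      .enter().append('p')
        .text(function(d) {              // accessor reads one column per row
            return d.cause + ': ' + d.prevalence;
        });
});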