I have the following data for a stacked bar chart. When I try to calculate the max value for the y-axis, it gives me 100, but it should be 70. Can someone tell me where I am going wrong?
The snippet is below:
//DATA
var data = [
  {
    category: "test1",
    type1: 10,
    type2: 20
  },
  {
    category: "test2",
    type1: 30,
    type2: 40
  }
];
var keys = ["type1", "type2"];
var yScale = d3.scaleLinear()
.range([400, 0]);
var stack = d3
.stack()
.keys(keys)
.order(d3.stackOrderNone)
.offset(d3.stackOffsetNone);
var layers = stack(data);
console.log(d3.max(layers[layers.length - 1], d => (d[0] + d[1])))
<script src="https://cdnjs.cloudflare.com/ajax/libs/d3/5.7.0/d3.min.js"></script>
The result is correct: you are finding the max of (10 + 30) and (30 + 70), which is the latter, 100.
The stack represents each data point with two values:
Each point is represented as an array [y0, y1], where y0 is the lower value (baseline) and y1 is the upper value (topline); the difference between y1 and y0 corresponds to the computed value for that point. (docs)
Given your inputs, the first category has points [0, 10] and [10, 30] for your two types: the second point's baseline equals the first point's topline (baseline + value). The second category has points [0, 30] and [30, 70].
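To make the layout concrete, here is the same stacked output written out as plain arrays, hand-computed from the data above (so d3 itself isn't needed to inspect it). Note that d3.stack groups by key, not by category: each series corresponds to one key and holds one [y0, y1] point per category.

```javascript
// Hand-computed equivalent of stack(data) for the data above.
// One series per key; one [y0, y1] point per category.
var layers = [
  // key "type1": baselines start at 0
  [[0, 10], [0, 30]],
  // key "type2": baselines sit on top of type1's toplines
  [[10, 30], [30, 70]]
];

// The toplines (y1) of the last series are the stacked totals per category:
var toplines = layers[layers.length - 1].map(d => d[1]);
console.log(toplines); // [30, 70]
```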
In your max function you are adding d[0] and d[1], which reduces each of the two element arrays above to a single sum:
// for each item in the array layers[layers.length-1], find the max of d[0] + d[1]
d3.max(layers[layers.length - 1], d => (d[0] + d[1]))
This gives you 100 because layers[layers.length - 1] is the series for the last key (type2), holding one point per category, not the second category's parts:
[
  [ 10 , 30 ],
  [ 30 , 70 ]
]
Summing the second point, 30 + 70, yields 100.
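To get 70 (the stacked height of the tallest bar), take the max of each point's topline d[1] rather than the sum d[0] + d[1] — with d3 that is d3.max(layers[layers.length - 1], d => d[1]). A minimal dependency-free sketch using the hand-computed last series:

```javascript
// The last series' points, as d3.stack would produce them for this data
var lastLayer = [[10, 30], [30, 70]];

// Max topline (y1): the total stacked height of the tallest bar.
// Equivalent to d3.max(lastLayer, d => d[1]).
var yMax = lastLayer.reduce((max, d) => Math.max(max, d[1]), -Infinity);
console.log(yMax); // 70
```

You can then set the scale's domain with that value, e.g. `yScale.domain([0, yMax])`.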