Okay, so I get that some numbers can't be represented exactly in binary, just like 1/3 can't be fully represented in decimal.
So how come, when I console.log(0.3), it returns 0.3, but when I console.log(0.1 + 0.2) it returns 0.30000000000000004?
How come it accounts for the error (if it even is one) when simply outputting 0.3, but doesn't when the addition occurs?
Suppose we approximate 1/3 and 2/3 in decimal.
1/3 = 0.333
2/3 = 0.667
and we add 1/3+1/3:
1/3+1/3 = 0.333 + 0.333 = 0.666
We didn't get our approximation of 2/3. Rounding 1/3 to three decimal places and adding it to itself didn't produce the number we get when we round 2/3 to three decimal places directly.
The same thing happens in binary. We round 0.1 and 0.2 to the nearest numbers we can represent in binary, but the sum of those approximations is slightly different from the approximation of 0.3: it comes out a bit higher. JavaScript prints the shortest decimal string that reads back as the same double, so the approximation of 0.3 displays simply as 0.3, while the slightly larger sum needs extra digits to distinguish it and displays as 0.30000000000000004.
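You can see the stored approximations directly by asking `toPrecision` for more digits than the default shortest form shows:

```javascript
// Default output: the shortest decimal string that parses back
// to the same double, which is why 0.3 prints as just "0.3".
console.log(0.3);       // 0.3
console.log(0.1 + 0.2); // 0.30000000000000004

// Ask for 20 significant digits to see the underlying doubles:
console.log((0.1).toPrecision(20));       // 0.10000000000000000555
console.log((0.2).toPrecision(20));       // 0.20000000000000001110
console.log((0.3).toPrecision(20));       // 0.29999999999999998890
console.log((0.1 + 0.2).toPrecision(20)); // 0.30000000000000004441
```

Note that the double nearest 0.3 is a bit below 0.3, while the sum of the doubles nearest 0.1 and 0.2 is a bit above it, so the two can never be equal.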