Possible Duplicate:
Is double Multiplication Broken in .NET?
JavaScript code:
var n = 1;
while ( n > 0 )
{
n -= 0.1;
document.body.innerHTML += n + "<br/>";
}
http://jsfiddle.net/UpUE6/4/
I expected:
0.9
0.8
0.7
0.6
0.5
0.4
0.3
0.2
0.1
0
But got this:
0.9
0.8
0.7000000000000001
0.6000000000000001
0.5000000000000001
0.40000000000000013
0.30000000000000016
0.20000000000000015
0.10000000000000014
1.3877787807814457e-16
-0.09999999999999987
Could anyone explain what's going on there?
From The Floating-Point Guide:
Why don’t my numbers, like 0.1 + 0.2 add up to a nice round 0.3, and instead I get a weird result like 0.30000000000000004?
Because internally, computers use a format (binary floating-point) that cannot accurately represent a number like 0.1, 0.2 or 0.3 at all.
When the code is compiled or interpreted, your “0.1” is already rounded to the nearest number in that format, which results in a small rounding error even before the calculation happens.
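You can see this pre-rounding directly in JavaScript by asking for more digits than the default display shows. The sketch below prints the closest double to the literal 0.1, which is not exactly 0.1:

```javascript
// The literal 0.1 is stored as the nearest IEEE 754 double,
// which is slightly larger than one tenth. Printing 20
// significant digits exposes the stored value.
console.log((0.1).toPrecision(20)); // "0.10000000000000000555"

// The default string conversion hides this by printing the
// shortest decimal that round-trips to the same double:
console.log(String(0.1)); // "0.1"
```

So the error in the loop above is not introduced by the subtraction itself; it is already present in the representation of `0.1`, and repeated subtraction merely accumulates it.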
You're seeing an artifact of the way computers do floating-point arithmetic. See, for example, http://en.wikipedia.org/wiki/Floating_point#Machine_precision
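One common workaround, sketched here for the loop in the question, is to keep the counter as an integer (a count of tenths) so every subtraction is exact, and divide only when displaying the value:

```javascript
// Integer arithmetic on small values is exact in doubles,
// so counting tenths avoids the accumulating rounding error.
var tenths = 10; // represents 1.0
while (tenths > 0) {
  tenths -= 1;              // exact: integers up to 2^53 are representable
  console.log(tenths / 10); // 0.9, 0.8, ..., 0.1, 0
}
```

Each division still rounds to the nearest double, but the errors no longer accumulate from step to step, so the output matches the expected list.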