The following script contains a very strange error. I want to check whether a value is a positive integer. To do this, I multiply by 100 to shift the decimal point and get a whole number. If I test 0.07, the script does not calculate the value 7, but 7.000000000000001. I could round the value, but I would like to know why the value is calculated this way.
<script type="text/javascript">
var isPositiveInt = function(i) {
    i = i * 100; // shift the decimal point; 0.07 should become 7
    return ((i % 1) == 0 && i >= 0); // whole number and not negative
};
</script>
<a href="#" onclick="alert(isPositiveInt('0.07')); return false;">Try it out!</a>
0.05, 0.06 and 0.08 work fine. But what happens with 0.07? I would be happy if someone could explain this to me.
That's because JavaScript represents every number as a double-precision (IEEE 754) float internally. As a result, calculations can pick up small errors from floating-point inaccuracy: see Floating point inaccuracy examples.
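You can see the noise directly in the console (assuming any modern browser or Node.js):

console.log(0.07 * 100);       // 7.000000000000001 rather than 7
console.log(0.1 + 0.2);        // 0.30000000000000004, the classic example
console.log((0.07 * 100) % 1); // a tiny non-zero remainder, so the % 1 check fails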
One way to fix this is to round the result back to the nearest integer after the intermediate calculation, instead of applying % 1 to the noisy value directly.
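For the isPositiveInt check above, one way to apply that idea (a sketch, with a small tolerance chosen here for illustration) is to round the scaled value and accept it if the original is within a tiny distance of that rounded result:

<script type="text/javascript">
var isPositiveInt = function(i) {
    var scaled = i * 100;
    // Accept the value if it is within a tiny tolerance of the nearest
    // whole number, so 7.000000000000001 still counts as 7.
    return Math.abs(scaled - Math.round(scaled)) < 1e-9 && scaled >= 0;
};
</script>

With this version, isPositiveInt('0.07') returns true, because the floating-point noise is far smaller than the tolerance.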