I was testing some of my JavaScript code: I added .1 + .2 and it gives me .30000000000000004 instead of .3, which I don't understand. But when I added .1 + .3 it gives me .4. I googled it and found it's something about double-precision addition, but I don't know what that is.
Here's the obligatory link: What Every Computer Scientist Should Know About Floating-Point Arithmetic
Basically, many base-10 numbers (including 0.1, 0.2, and 0.3) cannot be exactly represented in the binary floating-point format used by most computers, so you'll see issues like the ones you highlight.
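To see this concretely, here's a short sketch you can paste into a JavaScript console. It reproduces both results from the question and shows two common workarounds (an epsilon tolerance and rounding for display):

```javascript
// 0.1 and 0.2 have no exact binary representation, so their
// 64-bit floating-point sum is slightly off.
console.log(0.1 + 0.2);          // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);  // false

// 0.1 + 0.3 happens to round to the double nearest 0.4,
// so the error is invisible in this case.
console.log(0.1 + 0.3);          // 0.4

// Workaround 1: compare within a small tolerance instead of with ===.
const nearlyEqual = (a, b) => Math.abs(a - b) < Number.EPSILON;
console.log(nearlyEqual(0.1 + 0.2, 0.3));  // true

// Workaround 2: round to a fixed number of decimals for display.
console.log((0.1 + 0.2).toFixed(2));       // "0.30"
```

Note that `toFixed` returns a string, so it's suitable for display but not for further arithmetic.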
If you can't stay awake for What Every Computer Scientist Should Know About Floating-Point Arithmetic, try instead the javascript-specific Rounding in JavaScript.