I ran into a problem I can't quite explain, and frankly I'm surprised. When I try to increment the number 9933272057275866 by 1, it adds 2 instead! Please see the following code:
let test = 9933272057275866;
let test2 = test+1;
console.log('Before:', test);
console.log('After:', test2);
console.log('Sub:', test2-test);
And the respective output:
Before: 9933272057275866
After: 9933272057275868
Sub: 2
How can this be possible?
The environment is JavaScript. I first noticed this while submitting a challenge on HackerRank, then reproduced it in my own environment on Node.js. Same result!
What is happening?
Basically, it's because 64 bits (*) aren't enough to represent every integer of that magnitude exactly. These are the two adjacent representable values:
4341 A521 1037 32ED : 9.933272057275866 × 10^15
4341 A521 1037 32EE : 9.933272057275868 × 10^15
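If you want to verify these bit patterns yourself, here is a small sketch (not from the original answer; it assumes a modern JavaScript runtime such as Node.js) that dumps the raw IEEE 754 bytes of a number as hex:

function toHexBits(x) {
  // Store the number as a big-endian 64-bit float, then read back the 8 raw bytes.
  const buf = new ArrayBuffer(8);
  new DataView(buf).setFloat64(0, x);
  return [...new Uint8Array(buf)]
    .map(b => b.toString(16).padStart(2, '0'))
    .join('')
    .toUpperCase();
}

console.log(toHexBits(9933272057275866));     // 4341A521103732ED
console.log(toHexBits(9933272057275866 + 1)); // 4341A521103732EE (the +1 has already rounded)
console.log(toHexBits(9933272057275868));     // 4341A521103732EE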
Notice that one integer (9933272057275867) is skipped between them: it has no exact 64-bit representation, so adding 1 to 9933272057275866 rounds to the nearest representable value, 9933272057275868. In IEEE 754, the higher you go, the more the representable numbers are spread out along the number line. Keep in mind that floating-point numbers are approximations, which is why you get this result:
0.1 + 0.2 === 0.3 // false
The maximum safe integer for an IEEE 754 double is 2^53 - 1 = 9007199254740991 (exposed in JavaScript as Number.MAX_SAFE_INTEGER); your value is well beyond it.
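In JavaScript you can check whether a value is inside that safe range, and switch to BigInt when you need exact integers beyond it (a sketch, assuming Node.js 10.4+ or a similarly modern engine):

console.log(Number.MAX_SAFE_INTEGER);                // 9007199254740991 (2^53 - 1)
console.log(Number.isSafeInteger(9933272057275866)); // false: beyond the safe range
console.log(9933272057275866n + 1n);                 // 9933272057275867n: BigInt stays exact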
More fun facts about floating-point arithmetic:
A + B == B + A // true, commutative, except when A or B is NaN
(A + B) + C == A + (B + C) // false, not associative
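A concrete, easily verified instance of the non-associativity above:

console.log((0.1 + 0.2) + 0.3);                       // 0.6000000000000001
console.log(0.1 + (0.2 + 0.3));                       // 0.6
console.log((0.1 + 0.2) + 0.3 === 0.1 + (0.2 + 0.3)); // false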
* It is worth mentioning that numbers in JavaScript (ECMAScript) are represented as 64-bit IEEE 754 doubles, as specified by the ECMAScript language specification.