Was playing around with JS and noticed this. Why does parseInt(1e+21) return 1, while parseInt(1e+20) returns 100000000000000000000? parseInt(10e+20) also returns 1.
Why does parseInt(1.7976931348623157E+10308) return NaN, while parseFloat(1.7976931348623157E+10308) returns Infinity?
Even made a Fibonacci sequence: for any value whose string form has an exponent of +21 or more, parseInt() returns only the digits before the decimal point, while parseFloat() returns the correct number all the way up to Infinity.
Fibonacci demo (int : parseInt() : parseFloat())
(tested in Chrome)
parseInt and parseFloat are supposed to parse strings and convert them to Numbers. So when you pass them a Number, it first gets coerced to a string by the interpreter. At least in Firefox, "" + 1e+20 outputs "100000000000000000000", while "" + 1e+21 outputs "1e+21".
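In other words, the cutoff is where number-to-string conversion switches to exponential notation (at 1e+21), which a quick check confirms:

```javascript
// Number-to-string conversion uses plain digits up to (but excluding) 1e+21,
// and exponential notation from there on.
console.log(String(1e+20)); // "100000000000000000000"
console.log(String(1e+21)); // "1e+21"

// parseInt sees the string form, so the "e" stops the parse:
console.log(parseInt(1e+20)); // 100000000000000000000
console.log(parseInt(1e+21)); // 1
```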
Let's read the documentation of parseInt:

"If parseInt encounters a character that is not a numeral in the specified radix, it ignores it and all succeeding characters and returns the integer value parsed up to that point."

So parseInt("1.whatever") returns 1, and parseInt("32.231e+something") returns 32.
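A couple of quick checks of that stop-at-first-invalid-character rule (the trailing text in these strings is arbitrary):

```javascript
console.log(parseInt("1.whatever"));        // 1 -- stops at "."
console.log(parseInt("32.231e+something")); // 32
console.log(parseInt("ff", 16));            // 255 -- "f" is a numeral in radix 16
console.log(parseInt("ffg", 16));           // 255 -- "g" is not, so parsing stops
```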
A similar thing must be happening with parseFloat.
It's weak typing's fault. IMO, the correct behaviour in this situation would be to throw an exception.