I have the following code:
```javascript
var str = "0x4000000000000000"; // 4611686018427387904 decimal
var val = parseInt(str);
alert(val);
```
I get this value: "4611686018427388000", which is 0x4000000000000060.
I was wondering if JavaScript is mishandling 64-bit integers or am I doing something wrong?
JavaScript represents numbers using the IEEE-754 double-precision (64-bit) format. As I understand it, this gives you 53 bits of precision, or fifteen to sixteen decimal digits. Your number has more digits than JavaScript can cope with, so you end up with an approximation.
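You can see the 53-bit limit directly in any JS engine; the following is a small sketch showing where exact integer representation breaks down:

```javascript
// The largest integer a double can represent exactly, step by step, is 2^53 - 1.
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991

// 2^53 + 1 is odd and cannot be represented; it rounds to 2^53.
console.log(9007199254740993 === 9007199254740992); // true

// Two different 19-digit strings can parse to the same double.
console.log(parseInt("4611686018427387905") === parseInt("4611686018427387904")); // true
```

Above `Number.MAX_SAFE_INTEGER`, adjacent doubles are more than 1 apart, so distinct integers collapse onto the same value.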
This isn't really "mishandling" as such, but obviously it isn't very helpful if you need full precision on large numbers. There are a few JS libraries around that can handle larger numbers, e.g., BigNumber and Int64.