It turns out (after a bit of thought it's more obvious, but whatever) that BigInt, recently introduced to JavaScript, has a limit:
My question would be - is there a constant similar to Number.MAX_SAFE_INTEGER
but for BigInt?
This snippet of code:
let a = 2n, step = 1;
try {
  while (true) {
    console.log(step);
    a = a ** 2n; step++;
  }
} catch (e) { console.log(e); }
shows that the limit is hit at about step = 32, at least in Chrome. But I wonder what this value is per the spec.
Under the IEEE 754 double-precision standard used by the Number type, very large integers that cannot be exactly represented are automatically rounded. To be precise, the Number type in JavaScript can only safely represent integers between -9007199254740991 (-(2^53 - 1)) and 9007199254740991 (2^53 - 1). Any integer value that falls outside this range may lose precision.
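For example, the precision loss just past that range is easy to see in a console:
// Integers above Number.MAX_SAFE_INTEGER collapse onto the same double
console.log(Number.MAX_SAFE_INTEGER);                 // 9007199254740991
console.log(9007199254740992 === 9007199254740993);   // true (precision lost)
console.log(9007199254740992n === 9007199254740993n); // false (BigInt keeps them distinct)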
A 64-bit big integer is a binary integer with 63 bits of precision plus a sign bit; its range is -9223372036854775808 to +9223372036854775807.
The BigInt.asUintN() method can be useful to stay within the range of 64-bit arithmetic.
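For instance, a small sketch of wrapping values back into 64-bit range with asUintN() and its signed counterpart asIntN():
// Wrap an arbitrary BigInt into unsigned 64-bit range [0, 2^64 - 1]
console.log(BigInt.asUintN(64, 2n ** 64n + 5n));   // 5n
// Wrap into signed 64-bit range [-2^63, 2^63 - 1]
console.log(BigInt.asIntN(64, 2n ** 63n));         // -9223372036854775808n
console.log(BigInt.asIntN(64, 2n ** 63n - 1n));    // 9223372036854775807n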
It seems like there is no maximum limit to a BigInt as per spec, which makes sense considering BigInts are supposed to be arbitrary-precision integers, whose "digits of precision are limited only by the available memory of the host system".
As for v8 specifically, according to this article on the v8 blog, the precision of BigInts is "arbitrary up to an implementation-defined limit". Unfortunately, I couldn't find any further information on how the limit is determined. Maybe someone else can shed light on this based on these v8 BigInt implementation notes?
That said, based on the aforementioned articles, there doesn't seem to be a specific maximum value/size for a BigInt. Rather, it is likely determined based on the available memory on the system in some way.
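As a rough sketch (the exact cutoff and error message are implementation-specific), you can probe that limit by doubling the requested bit length until the engine gives up:
let bits = 1n;
try {
  while (true) {
    const probe = 1n << bits;   // needs bits + 1 bits of storage
    bits *= 2n;
  }
} catch (e) {
  console.log(`gave up around ${bits} bits:`, e.message);
}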
The maximum size of a BigInt in WebKit is defined as follows:
// The maximum length that the current implementation supports would be
// maxInt / digitBits. However, we use a lower limit for now, because
// raising it later is easier than lowering it.
// Support up to 1 million bits.
static constexpr unsigned maxLength = 1024 * 1024 / (sizeof(void*) * bitsPerByte);
The size of void* is platform-dependent: 8 bytes on 64-bit systems.
So there's your answer, right? Note that maxLength counts machine-word digits, not bits: 16384 digits of 64 bits each works out to the 1 million bits mentioned in the comment (minus 1 for the sign). But I can't create anywhere near that large a number in the console.
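For what it's worth, here is that arithmetic spelled out, assuming a 64-bit platform where sizeof(void*) is 8:
// maxLength counts internal "digits" (machine words), not bits
const bitsPerByte = 8;
const digitBits = 8 * bitsPerByte;              // 64 bits per digit on a 64-bit platform
const maxLength = (1024 * 1024) / digitBits;    // 16384 digits
console.log(maxLength, maxLength * digitBits);  // 16384, 1048576 (the "1 million bits")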
BigInt values have arbitrary precision, per 4.3.25 "BigInt value" in the spec, though, oddly, this is not mentioned in the 20.2 BigInt Objects section.
Here is a quick test program:
/* global BigInt */
let b = BigInt(10);
let exp = 1;
while (true) {
  console.log(`BigInt of 10^${exp}`);
  b = b * b;   // square the value: the exponent doubles each pass
  exp *= 2;
}
output with Node v13.2:
BigInt of 10^1
BigInt of 10^2
BigInt of 10^4
...
BigInt of 10^4194304
BigInt of 10^8388608
BigInt of 10^16777216
Performance really drags after about 10 to the millionth.
While there may be a platform-specific maximum in a specific browser, the size required to reach it would be large. Even 10^10^6 takes over 300 KB to store. You could extend the spec to express a limit with tetration, e.g., "the limit is about 10^10^10^... 8 times", but, seriously, that would be silly.
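If you would rather have the loop above stop on its own, a small variation wrapped in try/catch works (assuming the engine reports the overflow as a RangeError, as V8 does):
/* global BigInt */
let b = BigInt(10);
let exp = 1;
try {
  while (true) {
    b = b * b;   // square the value, doubling the exponent
    exp *= 2;
  }
} catch (e) {
  // e.g. "RangeError: Maximum BigInt size exceeded" in V8
  console.log(`failed while squaring 10^${exp}:`, e.message);
}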
It turns out the limit is 2^30 bits, so the largest value is 2^(2^30) - 1. Any more bits and it won't work. (I was unable to get the actual value, since a bit shift that large throws.) To put that in perspective, that number has roughly as many decimal digits (about 323 million) as there are people in the US.
(Wolfram Alpha link for the number of digits)
Tested on a modern Chrome browser.
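If you want to check the boundary yourself (assuming the limit really is 2^30 bits), a shift right at the edge succeeds while one past it throws:
// 2^(2^30 - 1) occupies exactly 2^30 bits: this allocates ~128 MB but should succeed
const atTheEdge = 1n << ((1n << 30n) - 1n);
console.log(atTheEdge > 0n);   // true
// 2^(2^30) would need 2^30 + 1 bits: this should throw a RangeError
try {
  const pastTheEdge = 1n << (1n << 30n);
  console.log('no limit hit');
} catch (e) {
  console.log(e.message);
}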