The descriptions of bitCount() and bitLength() are rather cryptic:
public int bitCount()
Returns the number of bits in the two's complement representation of this BigInteger that differ from its sign bit. This method is useful when implementing bit-vector style sets atop BigIntegers.
Returns: number of bits in the two's complement representation of this BigInteger that differ from its sign bit.
public int bitLength()
Returns the number of bits in the minimal two's-complement representation of this BigInteger, excluding a sign bit. For positive BigIntegers, this is equivalent to the number of bits in the ordinary binary representation. (Computes (ceil(log2(this < 0 ? -this : this+1))).)
Returns: number of bits in the minimal two's-complement representation of this BigInteger, excluding a sign bit.
What is the real difference between these two methods, and when should I use which?
I have occasionally used bitCount() to count the number of set bits in a positive integer, but I've only rarely used bitLength(), and usually when I actually meant bitCount(), because the differences between the descriptions are too subtle for me to instantly grok.
A quick demonstration:
import java.math.BigInteger;

public class Demo {
    public static void main(String[] args) {
        BigInteger b = BigInteger.valueOf(0x12345L);
        System.out.println("b = " + b.toString(2));
        System.out.println("bitCount(b) = " + b.bitCount());
        System.out.println("bitLength(b) = " + b.bitLength());
    }
}
prints
b = 10010001101000101
bitCount(b) = 7
bitLength(b) = 17
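The sign of the number matters, since both methods are defined in terms of the two's-complement representation. For negative numbers, bitCount() counts the bits that differ from the sign bit, i.e. the zero bits. A minimal sketch (the class name NegativeDemo is just for illustration):

```java
import java.math.BigInteger;

public class NegativeDemo {
    public static void main(String[] args) {
        // -8 in two's complement is ...11111000
        BigInteger m = BigInteger.valueOf(-8);
        // 3: the three trailing zeros differ from the sign bit (1)
        System.out.println("bitCount(-8)  = " + m.bitCount());
        // 3: ceil(log2(-this)) = ceil(log2(8)) = 3
        System.out.println("bitLength(-8) = " + m.bitLength());
    }
}
```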
So, for positive integers:
bitCount() returns the number of set bits in the number.
bitLength() returns the number of bits needed to represent the number, i.e. the position of the highest set bit plus one (floor(log2(n)) + 1).
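The Javadoc's remark that bitCount() "is useful when implementing bit-vector style sets" can be illustrated by treating a BigInteger as a growable set of non-negative integers, using setBit()/testBit() for membership. A sketch (the class name BitVectorSet is hypothetical):

```java
import java.math.BigInteger;

public class BitVectorSet {
    public static void main(String[] args) {
        // Start with the empty set; each set bit marks a member.
        BigInteger set = BigInteger.ZERO;
        set = set.setBit(3).setBit(100).setBit(7);   // add 3, 100, 7

        System.out.println(set.testBit(7));    // membership test: true
        System.out.println(set.testBit(8));    // membership test: false
        System.out.println(set.bitCount());    // cardinality of the set: 3
        System.out.println(set.bitLength());   // one past the largest member: 101
    }
}
```

Here bitCount() gives the set's cardinality, while bitLength() bounds the largest element, which is exactly the distinction the two Javadoc descriptions are getting at.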