I know how to find out how many bits are on in a given number (or how many elements are true in a boolean array) using a mask and bitwise operators, going over all bits and checking whether each one is on. Assuming the number is of arbitrary length, this algorithm runs in O(n) time, where n is the number of bits in the number. Is there an asymptotically better algorithm? I don't think that's possible, but how can I formally prove it?
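For reference, here is a minimal sketch of the naive approach described above (the function name count_bits_naive and the choice of unsigned int are my own):

#include <stdio.h>

/* Naive O(n) count: test each of the n bit positions with a moving mask. */
unsigned int count_bits_naive(unsigned int v)
{
    unsigned int count = 0;
    for (unsigned int mask = 1; mask != 0; mask <<= 1)
    {
        if (v & mask)
            count++;
    }
    return count;
}

int main(void)
{
    printf("%u\n", count_bits_naive(177)); /* 10110001 has 4 bits set, prints 4 */
    return 0;
}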
When looking for the number of bits needed to represent a given number of characters (letters, numbers, or symbols), you need to look at the powers of 2. For example, the reason that 5 bits are required to represent 27 characters is that 2^3 = 8 (not enough), 2^4 = 16 (still not enough), and 2^5 = 32, which is the first power of 2 that reaches 27.
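A small sketch of that calculation (the helper name bits_needed is my own, not from any library):

#include <stdio.h>

/* Smallest b such that 2^b >= count, i.e. the bits needed to give
   every one of `count` symbols a distinct code. */
unsigned int bits_needed(unsigned int count)
{
    unsigned int bits = 0;
    unsigned long capacity = 1;
    while (capacity < count)
    {
        capacity <<= 1;
        bits++;
    }
    return bits;
}

int main(void)
{
    printf("%u\n", bits_needed(27)); /* prints 5 */
    return 0;
}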
Bit Twiddling Hacks presents a number of methods, including this one:
Counting bits set, Brian Kernighan's way
unsigned int v; // count the number of bits set in v
unsigned int c; // c accumulates the total bits set in v
for (c = 0; v; c++)
{
  v &= v - 1; // clear the least significant bit set
}
Brian Kernighan's method goes through as many iterations as there are set bits. So if we have a 32-bit word with only the high bit set, then it will only go once through the loop.
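Wrapped up as a runnable sketch (the function name popcount_kernighan is my own wrapper around the snippet above):

#include <stdio.h>

/* Kernighan's method: each v &= v - 1 clears the lowest set bit,
   so the loop runs once per set bit rather than once per bit position. */
unsigned int popcount_kernighan(unsigned int v)
{
    unsigned int c;
    for (c = 0; v; c++)
    {
        v &= v - 1;
    }
    return c;
}

int main(void)
{
    printf("%u\n", popcount_kernighan(128)); /* 1 iteration, prints 1 */
    printf("%u\n", popcount_kernighan(177)); /* 4 iterations, prints 4 */
    printf("%u\n", popcount_kernighan(255)); /* 8 iterations, prints 8 */
    return 0;
}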
Examples of the algorithm in action:
128 & 127 == 0 10000000 & 01111111 == 00000000
177 & 176 == 176 10110001 & 10110000 == 10110000
176 & 175 == 160 10110000 & 10101111 == 10100000
160 & 159 == 128 10100000 & 10011111 == 10000000
128 & 127 == 0 10000000 & 01111111 == 00000000
255 & 254 == 254 11111111 & 11111110 == 11111110
254 & 253 == 252 11111110 & 11111101 == 11111100
252 & 251 == 248 11111100 & 11111011 == 11111000
248 & 247 == 240 11111000 & 11110111 == 11110000
240 & 239 == 224 11110000 & 11101111 == 11100000
224 & 223 == 192 11100000 & 11011111 == 11000000
192 & 191 == 128 11000000 & 10111111 == 10000000
128 & 127 == 0 10000000 & 01111111 == 00000000
As for the language-agnostic question of algorithmic complexity, it is not possible to do better than O(n), where n is the number of bits. Any correct algorithm must examine every bit of the number: if it skipped some bit, flipping that bit would change the correct answer without changing the algorithm's output.
What's tricky about this is when you aren't careful about the definition of n and let n be "the number of bit shifting/masking instructions" or some such. If n is the number of bits, then even a single bit mask (&) is already an O(n) operation.
So, can this be done in better than O(n) bit tests? No.
Can it be done in fewer than O(n) add/shift/mask operations? Yes.
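For instance, the parallel-counting ("SWAR") method also listed on Bit Twiddling Hacks counts a 32-bit word using a constant number of add/shift/mask operations (roughly O(log n) of them for an n-bit word), even though each such operation still touches all n bits internally. A sketch for 32-bit values:

#include <stdint.h>
#include <stdio.h>

/* Parallel (SWAR) bit count for a 32-bit word: sum bits in pairs,
   then nibbles, then use a multiply to add the byte sums together. */
unsigned int popcount_parallel(uint32_t v)
{
    v = v - ((v >> 1) & 0x55555555u);                 /* sums of 2 bits  */
    v = (v & 0x33333333u) + ((v >> 2) & 0x33333333u); /* sums of 4 bits  */
    v = (v + (v >> 4)) & 0x0F0F0F0Fu;                 /* sums of 8 bits  */
    return (v * 0x01010101u) >> 24;                   /* add the 4 bytes */
}

int main(void)
{
    printf("%u\n", popcount_parallel(0xFFFFFFFFu)); /* prints 32 */
    printf("%u\n", popcount_parallel(177u));        /* prints 4  */
    return 0;
}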