Does using bitwise operations in normal flow or conditional statements like for, if, and so on increase overall performance, and would it be better to use them where possible? For example:
if(i++ & 1) { }
vs.
if(i % 2) { }
In the multiplication case, the plain arithmetic version actually runs about 20% faster than the bitwise-shift equivalent. Division, on the other hand, is nearly twice as fast with a bitwise shift, and the bitwise modulus (really just an &) is more than three times faster!
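For reference, here is a minimal sketch of the kind of micro-benchmark that produces numbers like these (the loop bound and the volatile sink are illustrative choices, not taken from the original measurement; with optimization enabled, most compilers will emit identical code for both loops, which is exactly the point of the answers below):

#include <stdio.h>
#include <time.h>

int main(void) {
    /* volatile keeps the compiler from optimizing the loops away entirely */
    volatile unsigned sink = 0;
    const unsigned N = 100000000u;

    clock_t t0 = clock();
    for (unsigned i = 0; i < N; i++)
        sink += i % 2;          /* arithmetic modulus */
    clock_t t1 = clock();
    for (unsigned i = 0; i < N; i++)
        sink += i & 1;          /* bitwise equivalent */
    clock_t t2 = clock();

    printf("modulo:  %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("bitwise: %.3f s\n", (double)(t2 - t1) / CLOCKS_PER_SEC);
    return 0;
}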
On simple low-cost processors, bitwise operations are typically substantially faster than division, several times faster than multiplication, and sometimes significantly faster than addition.
Bitwise operations are incredibly simple and thus usually faster than arithmetic operations. For example, to get the green portion of an RGB value, the arithmetic approach is (rgb / 256) % 256. With bitwise operations you would do something like (rgb >> 8) & 0xFF.
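A small self-contained example showing that both expressions extract the same channel from a packed 0xRRGGBB value (the sample value is just for illustration):

#include <stdio.h>

int main(void) {
    unsigned rgb = 0x12AB34;                    /* red 0x12, green 0xAB, blue 0x34 */

    unsigned green_arith = (rgb / 256) % 256;   /* divide past blue, mod off red */
    unsigned green_bits  = (rgb >> 8) & 0xFF;   /* shift past blue, mask off red */

    printf("arithmetic: 0x%02X\n", green_arith);  /* prints 0xAB */
    printf("bitwise:    0x%02X\n", green_bits);   /* prints 0xAB */
    return 0;
}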
On most older microprocessors, bitwise operations are slightly faster than addition and subtraction, and usually significantly faster than multiplication and division.
Unless you're using an ancient compiler, it can already handle this level of conversion on its own. That is to say, a modern compiler can and will implement i % 2 using a bitwise AND instruction, provided it makes sense to do so on the target CPU (which, in fairness, it usually will).
In other words, don't expect to see any difference in performance between these, at least with a reasonably modern compiler that has a reasonably competent optimizer. In this case, "reasonably" has a pretty broad definition: even quite a few compilers that are decades old can handle this sort of micro-optimization with no difficulty at all.
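An easy way to verify this on your own toolchain is to compile with optimization and inspect the generated assembly, e.g. gcc -O2 -S parity.c (the function names below are hypothetical; the file name is whatever you choose):

/* With optimization enabled, both functions typically compile to the
   same single AND-with-1 instruction sequence on common targets. */
int is_odd_mod(unsigned i) { return i % 2; }
int is_odd_and(unsigned i) { return i & 1; }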