I just noticed that bitwise operations aren't as "smart" as logical "and/or" operations, and I wonder why.
Here's an example:
// For the record
private bool getTrue() { return true; }
private bool getFalse() { return false; }

// Since getTrue() returns true, getFalse() is never entered.
bool a = getTrue() || getFalse();

// Since getFalse() returns false, getTrue() is never entered.
bool b = getFalse() && getTrue();

// Since b is false, getTrue() is never entered.
b = b && getTrue();
However, the bitwise operators |= and &= aren't as smart:
bool a = getTrue();
a |= getFalse(); // a has no chance to become false, but getFalse() still runs.
a = getFalse();
a &= getTrue(); // a has no chance to become true, but getTrue() still runs.
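To make the difference visible, here is a small self-contained sketch (the Calls list is my own addition, used only to record which functions actually execute):

```csharp
using System;
using System.Collections.Generic;

class Program
{
    // Records which functions actually run, so short-circuiting is observable.
    static readonly List<string> Calls = new List<string>();

    static bool GetTrue()  { Calls.Add("GetTrue");  return true;  }
    static bool GetFalse() { Calls.Add("GetFalse"); return false; }

    static void Main()
    {
        bool a = GetTrue() || GetFalse();   // short-circuits: GetFalse is skipped
        Console.WriteLine(string.Join(",", Calls));

        Calls.Clear();
        a = GetTrue();
        a |= GetFalse();                    // eager: GetFalse still runs
        Console.WriteLine(string.Join(",", Calls));
    }
}
```

The first line printed is "GetTrue" alone, the second is "GetTrue,GetFalse" — showing that |= evaluates its right-hand side unconditionally.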
I wonder why they don't short-circuit in the same way.
One clarification: the operators &= and |= are NOT bitwise operators when applied to bools - they are logical operators. But they are the equivalent of x = x & y and x = x | y, which do not short-circuit like && and || do.
From MSDN:
The & operator performs a bitwise logical AND operation on integral operands and logical AND on bool operands.
The designers could have implemented ||= and &&=, but since they would only be appropriate for boolean types, there's not much value there.
D Stanley's answer is correct; your error is in thinking of & as being "bitwise" when applied to bool. It is better to think of & and && as being the eager and lazy versions of logical AND when applied to bool.
Now here's a question that you didn't ask, but is actually the more interesting question:
Why is there a non-short-circuiting version of AND and OR for bool in the first place? That is, why would you ever say exprX & exprY instead of exprX && exprY?
The bad reason is: expression exprY might have a side effect that you want to always happen regardless of the value of exprX. This is a bad reason because it's a questionable practice to use an expression both for its side effects and its value.
The good reason is: it can be faster to compute & than &&.
How is that possible? Surely if we can avoid computing the right hand side some of the time then we can always save time on average, right?
Wrong, wrong, wrong. z = x && y will be generated as code that has the structure:
if x goto CONSEQUENCE
z = false
goto DONE
CONSEQUENCE: z = y
DONE:
That's a lot of instructions compared to simply computing x & y and assigning the result to z, and larger code takes more time to load off disk, more time to JIT-compile, and uses up more space in the processor cache.
Moreover, those instructions contain a conditional branch and a non-conditional branch, increasing dramatically the number of basic blocks which the jitter must handle. (A "basic block" is a section of code with a clear start and finish such that if there are no exceptions, all the code in the basic block executes.) The jitter might choose to eschew certain optimizations when the number of basic blocks it has to analyze gets too large.
Worst of all, any conditional branch gives the CPU an opportunity for its branch predictor to make the wrong choice, which can have serious performance repercussions in some cases.
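The contrast above can be sketched in C# itself (my own illustration, mirroring the generated-code structure shown earlier): the && form corresponds to a conditional branch, while the & form is a single branch-free expression:

```csharp
using System;

class Demo
{
    static void Main()
    {
        bool x = true, y = false;

        // What z = x && y effectively compiles down to: a conditional branch.
        bool z;
        if (x)
            z = y;
        else
            z = false;

        // What z = x & y compiles to: one AND operation, no branching.
        bool z2 = x & y;

        Console.WriteLine(z == z2); // same result, different code shape
    }
}
```

Both forms always produce the same value; the difference is purely in the shape of the generated code and its branch behavior.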
Now, this is not to say that you should never use && for bools; there's no program yet whose dramatic success in the marketplace is attributable to the use of this nano-optimization. I point it out merely because it is not entirely obvious why there should be a non-short-circuiting logical operator on bools at all.