According to the specification this operator is called bit clear:
&^ bit clear (AND NOT) integers
I've never heard of such an operator before, and I'm wondering why it is useful.
It seems to take the left operand and disables all the bits that are turned on in the right operand. Is there any formal description of the operator?
One more thing I noticed is that it's not commutative.
Pseudocode comparing it with ^:
11110 &^ 100 //11010
11110 ^ 100 //11010
11110 &^ 0 //11110
11110 ^ 0 //11110
11110 &^ 11110 //0
11110 ^ 11110 //0
11110 &^ 111 //11000
11110 ^ 111 //11001
111 &^ 11110 //1
111 ^ 11110 //11001
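The comparison above can be run directly in Go using binary literals (a small sketch; the 0b literal prefix requires Go 1.13 or later):

```go
package main

import "fmt"

func main() {
	// &^ clears in the left operand every bit that is set in the
	// right operand, while ^ toggles those bits.
	fmt.Printf("%b\n", 0b11110&^0b100) // 11010
	fmt.Printf("%b\n", 0b11110^0b100)  // 11010
	fmt.Printf("%b\n", 0b11110&^0b111) // 11000
	fmt.Printf("%b\n", 0b11110^0b111)  // 11001
	fmt.Printf("%b\n", 0b111&^0b11110) // 1
	fmt.Printf("%b\n", 0b111^0b11110)  // 11001
}
```

Note that the two operators agree only when the set bits of the right operand are a subset of the left operand's set bits; otherwise ^ turns extra bits on where &^ leaves them off, which is also why &^ is not commutative.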
From the symbol (a concatenation of & and ^), the name "AND NOT", and the term "bit clear" (which sounds like the opposite of "bit set"), it seems evident that A &^ B is doing A & (^B), where ^ here is the unary bitwise complement.
This is backed up by examining the operator's truth table:
fmt.Println(0 &^ 0) // 0
fmt.Println(0 &^ 1) // 0
fmt.Println(1 &^ 0) // 1
fmt.Println(1 &^ 1) // 0
(See http://ideone.com/s4Pfe9.)
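As for why it is useful: the typical use is clearing specific flag bits in a bitmask without touching the others. A minimal sketch, where FlagRead, FlagWrite, and FlagExec are hypothetical flag names chosen for illustration:

```go
package main

import "fmt"

const (
	FlagRead  = 1 << iota // 0b001 (hypothetical flags for illustration)
	FlagWrite             // 0b010
	FlagExec              // 0b100
)

func main() {
	perms := FlagRead | FlagWrite | FlagExec

	// perms &^ FlagWrite clears only the write bit; the other bits
	// are unaffected, whether or not FlagWrite was set to begin with.
	perms = perms &^ FlagWrite

	fmt.Println(perms&FlagWrite == 0) // true: write bit cleared
	fmt.Println(perms&FlagRead != 0)  // true: read bit untouched
	fmt.Println(perms&FlagExec != 0)  // true: exec bit untouched
}
```

Compare this with perms ^ FlagWrite, which would turn the write bit back on if it happened to be off already; &^ unconditionally forces it off.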