As documented here, std::bitset::operator^= returns *this. From that, and from the "usual" interpretation of operators such as +=, |=, and *=, one could reasonably assume that, given std::bitset instances a and b (of the same size), the expression (a^=b).count() will store the result of a bitwise XOR operation in a, and that count() will return the number of bits in a that are set to true. However, as the following minimal example demonstrates, something unexpected happens:
#include <iostream>
#include <bitset>
int main()
{
constexpr unsigned int N=6;
std::bitset<N> a;
std::bitset<N> b;
a.flip();//111111
b[0]=1;
b[4]=1;//b is now 010001 (assuming least significant bit on the right end of the string)
std::cout<<"a=="<<a.to_string()<<std::endl;
std::cout<<"b=="<<b.to_string()<<std::endl;
std::cout<<"(a xor b) to string=="<<(a^=b).to_string()<<std::endl;
//Here is the unexpected part!
std::cout<<"(a xor b) count=="<<(a^=b).count()<<std::endl;
//Note that the following lines would produce the correct result
//a^=b;
//std::cout<<a.count()<<std::endl;
return 0;
}
The output is
a==111111
b==010001
(a xor b) to string==101110
(a xor b) count==6 //this is wrong!!!!! It should be 4...
A quick look at the implementation of std::bitset (see here) seems to indicate that the returned reference is indeed a reference to the lhs object (a in my example). So... why is this happening?
This has nothing to do with std::bitset. Consider this code:
int a = 2;
int b = 3;
std::cout << std::to_string(a *= b) << std::endl; // Prints 6.
std::cout << std::to_string(a *= b) << std::endl; // Prints 18.
You are using a compound assignment operator, so your variable/bitset changes every time the expression is evaluated. In your case, the second evaluation yields (a ^ b) ^ b, which is of course the original a (which did have 6 bits set).