Here is my code:
#include <iostream>
#include <string>
#include <bitset>
using namespace std;

int main()
{
    long long int number;
    number = 123456789123456789;
    string binary = bitset<64>(number).to_string();
    cout << binary << "\n";
    return 0;
}
Here is the result: 0000000000000000000000000000000010101100110100000101111100010101, but it's wrong: the upper 32 bits are all zero.
Info from the comments (and the OP's experiments):
The same code produces the expected binary representation in other environments. What is the reason?
I can reproduce the problem under the following circumstances: with GCC 10.2, for example, it happens with -m32 -std=c++03; with GCC 4.9.2, just with -m32.
Until C++11, there was no unsigned long long defined by the standard. Although C++98/03 implementations may provide it as a non-standard extension, the parameter of the std::bitset constructor was of type unsigned long only, which in the cases described above is just 32 bits wide. That is where you lost the upper bits.
Live demo: https://godbolt.org/z/PWdY8K
The relevant part in libstdc++ is here:
#if __cplusplus >= 201103L
constexpr bitset(unsigned long long __val) noexcept
: _Base(_Sanitize_val<_Nb>::_S_do_sanitize_val(__val)) { }
#else
bitset(unsigned long __val)
: _Base(__val)
{ _M_do_sanitize(); }
#endif