Why does bitset store the bits in reverse order? After struggling many times I have finally written this binary_to_dec function. Could it be simplified?
int binary_to_dec(std::string bin)
{
    std::bitset<8> bit;
    int c = bin.size();
    for (size_t i = 0; i < bin.size(); i++, c--)
    {
        bit.set(c - 1, (bin[i] - '0' ? true : false));
    }
    return bit.to_ulong();
}
Bitset stores its numbers in what you consider to be "reverse" order because we write the digits of a number in decreasing order of significance even though the characters of a string are arranged in increasing index order.
If we wrote our numbers in little-endian order, then you wouldn't have this confusion because the character at index 0 of your string would represent bit 0 of the bitset. But we write our numbers in big-endian order. I'm afraid I don't know the details of human history that led to that convention. (And note that the endianness that any particular CPU uses to store multi-byte numbers is irrelevant. I'm talking about the endianness we use when displaying numbers for humans to read.)
For example, if we write the decimal number 12 in binary, we get 1100. The least significant bit is on the right. We call that "bit 0." But if we put that in a string, "1100", the character at index 0 of that string represents bit 3, not bit 0. If we created a bitset with the bits in the same order as the characters, to_ulong would return 3 instead of 12.
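You can check that mapping directly (a small standalone sketch, using only standard std::bitset operations):

#include <bitset>
#include <iostream>
#include <string>

int main()
{
    std::bitset<8> b(std::string("1100"));
    std::cout << b.to_ulong() << '\n';         // 12: the first character maps to the highest used bit
    std::cout << b[0] << ' ' << b[3] << '\n';  // 0 1: operator[] counts from the least significant bit
    std::cout << b << '\n';                    // 00001100: streaming prints the most significant bit first
}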
The bitset class has a constructor that accepts a std::string, and it already accounts for this convention: the first character of the string is taken as the most significant bit, just as when we write the number, so there is no need to reverse anything. Try this:
int binary_to_dec(std::string const& bin)
{
    std::bitset<8> bit(bin);
    return bit.to_ulong();
}
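With the constructor doing the ordering for you, binary_to_dec("1100") returns 12, the same result as the hand-rolled loop in the question.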
If you need to handle strings longer than 8 characters, size the bitset to the full width of unsigned long:

unsigned long binary_to_dec(std::string bin)
{
    std::bitset<sizeof(unsigned long) * 8> bits(bin);
    return bits.to_ulong();
}
EDIT: formatting and return type.
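One caveat: the string constructor throws std::invalid_argument if the string contains any character other than '0' or '1'. If you don't need a bitset at all, std::stoul with base 2 performs the same conversion; here is a minimal sketch (the name parse_binary is ours, not from the answers above):

#include <stdexcept>
#include <string>

unsigned long parse_binary(const std::string& bin)
{
    // Base-2 parse: throws std::invalid_argument if no binary digits can be read,
    // std::out_of_range if the value doesn't fit in unsigned long; trailing
    // non-binary characters are silently ignored.
    return std::stoul(bin, nullptr, 2);
}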