
Why are the bits of a std::bitset in reverse order? [duplicate]

Tags: c++, bitset

Why does bitset store the bits in reverse order? After struggling many times, I have finally written this binary_to_dec function. Could it be simplified?

int binary_to_dec(std::string bin)
{
    std::bitset<8> bit;

    int c = bin.size();

    for (size_t i = 0; i < bin.size(); i++, c--)
    {
        // bin[0] is the most significant digit, so it goes into the
        // highest bit position (c - 1 counts down from size - 1 to 0).
        bit.set(c - 1, bin[i] != '0');
    }

    return bit.to_ulong();
}
asked Feb 11 '11 by user4344



2 Answers

Bitset stores its numbers in what you consider to be "reverse" order because we write the digits of a number in decreasing order of significance even though the characters of a string are arranged in increasing index order.

If we wrote our numbers in little-endian order, then you wouldn't have this confusion because the character at index 0 of your string would represent bit 0 of the bitset. But we write our numbers in big-endian order. I'm afraid I don't know the details of human history that led to that convention. (And note that the endianness that any particular CPU uses to store multi-byte numbers is irrelevant. I'm talking about the endianness we use when displaying numbers for humans to read.)

For example, if we write the decimal number 12 in binary, we get 1100. The least significant bit is on the right. We call that "bit 0." But if we put that in a string, "1100", the character at index 0 of that string represents bit 3, not bit 0. If we created a bitset with the bits in the same order as the characters, to_ulong would return 3 instead of 12.

The bitset class has a constructor that accepts a std::string, and it reads the characters in the same order we write numbers: the leftmost character becomes the most significant bit. So despite the "reversed" indexing of set() and operator[], you can pass your string through unchanged:

int binary_to_dec(std::string const& bin)
{
  std::bitset<8> bit(bin); // "1100" -> bit 3 = 1, bit 2 = 1, bit 1 = 0, bit 0 = 0
  return bit.to_ulong();   // 12
}
answered Oct 21 '22 by Rob Kennedy


unsigned long binary_to_dec(std::string bin)
{
    // The string constructor reads the leftmost character as the most
    // significant bit, so no manual reversal is needed. Note it throws
    // std::invalid_argument if bin contains anything other than '0'/'1'.
    std::bitset<sizeof(unsigned long)*8> bits(bin);
    return bits.to_ulong();
}

EDIT: formatting and return type.

answered Oct 21 '22 by Jeremy CD