I considered the C++11-based enum bitset introduced here, and came up with a sample program:
#include <bitset>
#include <type_traits>
#include <limits>

template <typename TENUM>
class FlagSet {
private:
    using TUNDER = typename std::underlying_type<TENUM>::type;
    std::bitset<std::numeric_limits<TUNDER>::max()> m_flags;
public:
    FlagSet() = default;
    FlagSet(const FlagSet& other) = default;
};

enum class Test
{
    FIRST,
    SECOND
};

int main(int argc, char *argv[])
{
    FlagSet<Test> testFlags;
    return 0;
}
The program compiles just fine using clang++ (clang version 3.8.1 (tags/RELEASE_381/final)) via clang++ -std=c++11 -o main main.cc. However, if I use g++ (g++ (GCC) 6.2.1 20160830) via g++ -std=c++11 -o main main.cc instead, the compiler eventually exhausts system memory. Is this an issue with g++, or is this code somehow not compliant with the standard?
std::bitset<std::numeric_limits<TUNDER>::max()> declares a bitset of 2^31 - 1 bits, which is 256 MiB in size (assuming a 32-bit int). It's great that clang successfully compiles it, but it's not particularly surprising that gcc runs out of memory.
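As a quick sanity check (a standalone sketch, not part of the original program), one can compute how much storage that instantiation requires:

#include <limits>
#include <iostream>

int main()
{
    // std::numeric_limits<int>::max() is 2^31 - 1 with a 32-bit int,
    // so the bitset needs that many bits; divide by 8 for bytes,
    // then by 1024*1024 for MiB.
    constexpr long long bits = std::numeric_limits<int>::max();
    std::cout << bits / 8 / (1024 * 1024) << " MiB\n";  // prints 255
}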
If you're intending to use the enumerators as bitset indices, you'll have to pass the largest enumerator in as a separate template parameter; there is as yet no way to find the range of an enumeration (see Max and min values in a C++ enum).
Example:
#include <bitset>
#include <cstddef>

template <typename TENUM, TENUM MAX>
class FlagSet {
private:
    // A scoped enumerator doesn't convert implicitly to an integer,
    // so cast it to size the bitset (largest enumerator + 1 bits).
    std::bitset<static_cast<std::size_t>(MAX) + 1> m_flags;
public:
    FlagSet() = default;
    FlagSet(const FlagSet& other) = default;
};
enum class Test
{
    FIRST,
    SECOND,
    MAX = SECOND
};

FlagSet<Test, Test::MAX> testFlags;
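For completeness, here is a minimal sketch of how the flags might be set and queried. The set/test member functions below are hypothetical additions for illustration, not part of the original answer:

#include <bitset>
#include <cstddef>
#include <iostream>

template <typename TENUM, TENUM MAX>
class FlagSet {
private:
    std::bitset<static_cast<std::size_t>(MAX) + 1> m_flags;
public:
    // Hypothetical accessors: convert the enumerator to a bit index.
    void set(TENUM flag) { m_flags.set(static_cast<std::size_t>(flag)); }
    bool test(TENUM flag) const { return m_flags.test(static_cast<std::size_t>(flag)); }
};

enum class Test { FIRST, SECOND, MAX = SECOND };

int main()
{
    FlagSet<Test, Test::MAX> flags;
    flags.set(Test::FIRST);
    std::cout << flags.test(Test::FIRST) << ' '    // prints 1
              << flags.test(Test::SECOND) << '\n'; // prints 0
}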