
enums exceeding the size of the largest number type

Tags:

c++

enums

I want to fully understand how a C++ compiler deals with an enum exceeding the range of the largest standard integer type, i.e., one containing both -1 and UINT64_MAX at the same time:

enum A {
    X = -1,
    Y = UINT64_MAX
};

At first I thought a compiler wouldn't accept this code. Indeed, it does not compile when enum is replaced by enum class, but the example above compiles. According to the standard, we have for the underlying type:

Declares an unscoped enumeration type whose underlying type is not fixed (in this case, the underlying type is an implementation-defined integral type that can represent all enumerator values; this type is not larger than int unless the value of an enumerator cannot fit in an int or unsigned int. If the enumerator-list is empty, the underlying type is as if the enumeration had a single enumerator with value 0). (https://en.cppreference.com/w/cpp/language/enum)

But what does this mean for my example?
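For enumerators that all fit into a standard integer type, the quoted rule seems clear to me. Here is a minimal sketch of how I read it (only the "can represent all enumerator values" part is guaranteed; the exact type chosen is implementation-defined):

#include <cstdint>
#include <type_traits>

enum Small { S = 1 };           // fits in int: underlying type is no larger than int
enum Big { B = UINT64_MAX };    // does not fit in int: a 64-bit unsigned type is needed

// Both assertions follow from the quoted rule.
static_assert(sizeof(std::underlying_type<Small>::type) <= sizeof(int), "Small fits in int");
static_assert(sizeof(std::underlying_type<Big>::type) >= 8, "Big needs at least 64 bits");

But my example mixes a negative value with UINT64_MAX, which no single 64-bit type can hold.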

I wrote a small sample program to find out what happens:

#include <iostream>
#include <cstdint>

enum A {
    X = -1,
    XX = -1,
    Y = UINT64_MAX
};

int main()
{

    std::cout << "X unsigned: " << (uint64_t)(X) << ", signed: " << (int64_t)(X) << std::endl;
    std::cout << "Y unsigned: " << (uint64_t)(Y) << ", signed: " << (int64_t)(Y) << std::endl;

    std::cout << "(X == XX) == " << (X == XX) << std::endl;
    std::cout << "(X == Y) == " << (X == Y) << std::endl;
}

The output is:

X unsigned: 18446744073709551615, signed: -1
Y unsigned: 18446744073709551615, signed: -1
(X == XX) == 1
(X == Y) == 0

Now I am quite confused. Apparently X and Y print as the same number, yet they are still distinguishable, i.e., the comparison X == Y is false (while X == XX is true). What happens here?

I know the better way is not to use the old enum but the new enum class. Still, plain enum is widely used, and I want to understand what happens here.
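For completeness, as far as I can tell this is why the enum class variant is rejected (a minimal sketch; the exact diagnostics are compiler-specific):

#include <cstdint>

// Without a fixed underlying type a scoped enum defaults to int, so
//     enum class B { X = -1, Y = UINT64_MAX };
// is rejected because UINT64_MAX does not fit in int. Fixing the type,
//     enum class C : std::uint64_t { X = -1, Y = UINT64_MAX };
// is also rejected, because -1 is a narrowing conversion to uint64_t.
// A scoped enum only compiles once every enumerator fits the fixed type:
enum class D : std::uint64_t {
    Y = UINT64_MAX
};

int main() {}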

asked Jan 04 '19 by Patrick Roocks


1 Answer

Your compiler is most likely using a 128-bit signed integral type as the underlying type, in accordance with the C++ standard.

See for yourself with

std::cout << sizeof(std::underlying_type<A>::type);

Link: https://ideone.com/z4K0rz, outputs 16.

The output you observe is consistent with a narrowing conversion of this 128-bit value to a 64-bit unsigned type.
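A fuller sketch, assuming a GCC/Clang-style implementation where that 128-bit type is __int128 (the standard only guarantees some integral type wide enough for all enumerators):

#include <iostream>
#include <type_traits>
#include <cstdint>

enum A {
    X = -1,
    Y = UINT64_MAX
};

int main()
{
    // The implementation picks an integral type wide enough for both -1 and
    // UINT64_MAX; here that is a 128-bit signed type, hence sizeof is 16.
    std::cout << sizeof(std::underlying_type<A>::type) << '\n';   // 16

    // Within that 128-bit type, X is -1 and Y is 2^64 - 1: two different
    // values, so X == Y is false.
    std::cout << (X == Y) << '\n';                                // 0

    // Casting either value to uint64_t narrows both to the same 64-bit
    // pattern, which is why they print identically in the question.
    std::cout << (uint64_t)X << ' ' << (uint64_t)Y << '\n';
}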

answered Oct 09 '22 by Bathsheba