 

Size of enums in bytes of different compilers [duplicate]

Is the size of an enum always the same across different compilers (gcc, Visual C++, and others)? That is, does sizeof() applied to a specific enum give the same value with every compiler that follows the C/C++ standards?

Luke asked Sep 07 '11

People also ask

What is the size of enum in bytes?

The size is four bytes because the enum is stored as an int. With only 12 values, you really only need 4 bits, but 32-bit machines process 32-bit quantities more efficiently than smaller quantities. Without enums, you might be tempted to use raw integers to represent the months.

What is the size of an enum?

On an 8-bit processor, enums can be 16-bits wide. On a 32-bit processor they can be 32-bits wide or more or less. The GCC C compiler will allocate enough memory for an enum to hold any of the values that you have declared. So, if your code only uses values below 256, your enum should be 8 bits wide.

How do you calculate enums bytes?

typedef enum enumStruc_2 { ENUM_5 = 0xFFFFFFF0, ENUM_6, ENUM_7, ENUM_8 } enumStruc_2; Here, the size of the enum is 4 bytes (the size of int), because 0xFFFFFFF0 needs at least 32 bits to represent.

Are enums 32 bit?

Default. The default is -fno-short-enums; that is, the size of an enumeration type is at least 32 bits regardless of the size of the enumerator values.


2 Answers

No.

In both C and C++ an enum will have a size such that all its values can be represented, and it will be compatible with an integer type. Different compilers may use different algorithms to choose that type (if it is not specified by another standard, such as a clearly defined ABI). C++11 additionally allows you to specify the underlying type with new syntax.

AProgrammer answered Nov 03 '22


"Each enumerated type shall be compatible with char, a signed integer type, or an unsigned integer type. The choice of type is implementation-defined, but shall be capable of representing the values of all the members of the enumeration."

"...An implementation may delay the choice of which integer type until all enumeration constants have been seen."

ISO/IEC 9899:1999 (E) p.105

So we only have upper bounds on sizeof(enum). On most systems I have seen sizeof(enum) == 4, but the STM compiler made sizeof(enum) 1, 2, or 4 depending on the values written in the enum.

Edit: it seems that you can set one of your enum's values to the maximum int value to ensure that the compiler chooses a type at least as wide as int for the enum.

Dmitrii Z. answered Nov 03 '22