Is the sizeof(enum) == sizeof(int), always?


  • Or is it compiler dependent?
  • Is it wrong to say that, since compilers are optimized for word lengths (memory alignment), int is the word size on a particular compiler? Does that mean there is no processing penalty if I use enums, as they would be word-aligned?
  • Is it not better to put all the return codes in an enum, since I clearly do not care about the values they get, only about the names when checking the return types? If that is the case, wouldn't #define be better, as it would save memory?

What is the usual practice? If I have to transport these return types over a network and some processing has to be done at the other end, what would you prefer: enums, #defines, or const ints?

EDIT - Just checking on the net: since compilers don't keep symbolic information for macros, how do people debug then? By comparing the integer value with the header file?

From the answers (I am adding the line below, as I need clarification):

"So it is implementation-defined, and sizeof(enum) might be equal to sizeof(char), i.e. 1."

  • Does that not mean that the compiler checks the range of values in the enum and then assigns memory? I don't think so, but of course I don't know. Can someone please explain to me what "might be" means?
asked Jul 11 '09 by Vivek Sharma



2 Answers

It is compiler-dependent and may differ between enums. The following are the semantics:

enum X { A, B };

// A has type int
assert(sizeof(A) == sizeof(int));

// some integer type. Maybe even int. This is
// implementation defined.
assert(sizeof(enum X) == sizeof(some_integer_type));

Note that "some integer type" in C99 may also include extended integer types (which the implementation, however, has to document, if it provides them). The type of the enumeration is some type that can store the value of any enumerator (A and B in this case).

I don't think there are any penalties in using enumerations. Enumerators are integral constant expressions too (so you may use them to initialize static or file-scope variables, for example), and I prefer them to macros whenever possible.

Enumerators don't need any runtime memory. Only when you create a variable of the enumeration type do you use runtime memory. Just think of enumerators as compile-time constants.
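
For example (a minimal sketch; the Color enum and its names are made up for illustration), the enumerators below cost nothing at run time, and only the variable c and the array they size occupy storage:

#include <assert.h>

enum Color { RED, GREEN, BLUE, COLOR_COUNT };   // enumerators are compile-time constants

static int counts[COLOR_COUNT];   // usable as an array size at file scope
static int initial = GREEN;       // usable in a static initializer

int main(void) {
    enum Color c = BLUE;          // only variables like this occupy runtime storage
    assert(initial == GREEN);
    counts[c]++;                  // counts[] was sized by the enumerator COLOR_COUNT
    switch (c) {                  // enumerators also work as case labels
    case RED:
    case GREEN:
        return 1;
    case BLUE:
        return 0;
    default:
        return 2;
    }
}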

I would just use a type that can store the enumerator values (I should know the rough range of values beforehand), cast to it, and send it over the network. Preferably the type should be a fixed-width one, like int32_t, so there are no conflicts when different machines are involved. Or I would print the number and scan it on the other side, which gets rid of some of these problems.
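
A sketch of the fixed-width approach, assuming POSIX htonl/ntohl for the byte-order conversion; the status codes and function names are invented for illustration, and a real program would hand the 4-byte buffer to whatever transport it actually uses:

#include <stdint.h>
#include <string.h>
#include <arpa/inet.h>   // htonl / ntohl (POSIX)

// illustrative return codes; the names are invented for this sketch
enum status { STATUS_OK = 0, STATUS_RETRY = 1, STATUS_FAIL = 2 };

// pack the code into a fixed-width integer in network byte order
static void encode_status(enum status s, unsigned char out[4]) {
    uint32_t wire = htonl((uint32_t)s);
    memcpy(out, &wire, sizeof wire);
}

static enum status decode_status(const unsigned char in[4]) {
    uint32_t wire;
    memcpy(&wire, in, sizeof wire);
    return (enum status)ntohl(wire);
}

int main(void) {
    unsigned char buf[4];
    encode_status(STATUS_RETRY, buf);                // buf is what would go on the wire
    return decode_status(buf) == STATUS_RETRY ? 0 : 1;
}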


Response to Edit

Well, the compiler is not required to use any particular size. An easy thing to see is that the sign of the values matters - unsigned types can give a significant performance boost in some calculations. The following is the behavior of GCC 4.4.0 on my box:

int main(void) {
  enum X { A = 0 };
  enum X a; // X compatible with "unsigned int"
  unsigned int *p = &a;
}

But if you assign a -1, then GCC chooses to use int as the type that X is compatible with:

int main(void) {
  enum X { A = -1 };
  enum X a; // X compatible with "int"
  int *p = &a;
}

Using GCC's --short-enums option makes it use the smallest type that still fits all the values:

int main() {
  enum X { A = 0 };
  enum X a; // X compatible with "unsigned char"
  unsigned char *p = &a;
}

In recent versions of GCC, the compiler flag has changed to -fshort-enums. On some targets, the default type is unsigned int. You can check the answer here.
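
If you want to see what your own toolchain picks, a quick check like the following (compile it once normally and once with -fshort-enums, assuming GCC or Clang) prints the sizes it chose; the exact output is implementation-defined:

#include <stdio.h>

enum Small { SMALL_A = 0, SMALL_B = 1 };        // fits in a char
enum Wide  { WIDE_A = 0, WIDE_B = 100000 };     // needs at least 32 bits

int main(void) {
    printf("sizeof(enum Small) = %zu\n", sizeof(enum Small));
    printf("sizeof(enum Wide)  = %zu\n", sizeof(enum Wide));
    printf("sizeof(int)        = %zu\n", sizeof(int));
    return 0;
}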

answered Sep 19 '22 by Johannes Schaub - litb


C99, 6.7.2.2p4 says

Each enumerated type shall be compatible with char, a signed integer type, or an unsigned integer type. The choice of type is implementation-defined,108) but shall be capable of representing the values of all the members of the enumeration. [...]

Footnote 108 adds

An implementation may delay the choice of which integer type until all enumeration constants have been seen.

So it is implementation-defined, and sizeof(enum) might be equal to sizeof(char), i.e. 1.

In choosing the size of some small range of integers, there is always a penalty. If you make it small in memory, there is probably a processing penalty; if you make it larger, there is a space penalty. It's a time-space tradeoff.

Error codes are typically #defines, because they need to be extensible: different libraries may add new error codes. You cannot do that with enums.
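
A sketch of that pattern, with invented header and macro names: a base header reserves a range of codes, and a separate library adds its own codes above that range without editing the base header, which a single enum definition could not absorb. (Both "headers" are inlined here so the sketch compiles as one file.)

// base_errors.h (illustrative) -- the core library's codes
#define ERR_OK        0
#define ERR_IO        1
#define ERR_TIMEOUT   2
#define ERR_BASE_LAST 2          // other libraries add codes above this value

// netlib_errors.h (illustrative) -- a different library extends the set
#define ERR_NET_DNS   (ERR_BASE_LAST + 1)
#define ERR_NET_RESET (ERR_BASE_LAST + 2)

#include <stdio.h>

int main(void) {
    printf("ERR_NET_RESET = %d\n", ERR_NET_RESET);   // prints 4
    return 0;
}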

answered Sep 20 '22 by Martin v. Löwis