
Size of types in C/C++

Tags: c++, c, sizeof

I recently asked a question here regarding the size of char. Looking at my question, it brought me to another question:
Are things like the number of bits in a char, or sizeof(int), CPU dependent, OS dependent, compiler dependent, or some combination of the above? Who decides that sizeof(int) in my compiler is 4?

EDIT: Let me explain: for example, my compiler on a 64-bit system uses a 32-bit int. Is this set by the compiler, or by the OS, as the standard int for all compilers on this (exact) OS/platform combination? How about char = 8 bits? Can an OS decide it is going to use 16-bit chars? Can a compiler?

asked Jul 22 '12 by Baruch

1 Answer

According to all ISO C standards, all sizes are measured in multiples of the size of a char. This means that, by definition, sizeof(char) == 1. The size of a char in bits is given by the macro CHAR_BIT in <limits.h> (or <climits>), and a char must be at least 8 bits wide.
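As a quick illustration (a minimal sketch, not part of the original answer), you can print both values on your own platform:

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        /* sizeof(char) is 1 by definition; CHAR_BIT gives bits per char */
        printf("CHAR_BIT     = %d\n", CHAR_BIT);
        printf("sizeof(char) = %zu\n", sizeof(char)); /* always prints 1 */
        return 0;
    }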

Additional type restrictions are:

sizeof(char) <= sizeof(short int) <= sizeof(int) <= sizeof(long int)

int must be able to represent at least the range -32767 to +32767, i.e. it must be at least 16 bits wide.

C99 added long long int, whose size is greater than or equal to that of long int.

The rest is implementation-defined. This means that the C implementation in question gets to choose exactly how large these types are.
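These guarantees can be checked mechanically. Here is a minimal sketch, assuming a C11 compiler (for _Static_assert), that encodes the relationships listed above:

    #include <limits.h>

    /* These assertions encode the guarantees listed above; they should
       compile on any conforming C11 implementation, whatever sizes the
       implementation actually picks. */
    _Static_assert(sizeof(char) == 1, "sizeof(char) is 1 by definition");
    _Static_assert(sizeof(short int) <= sizeof(int), "short <= int");
    _Static_assert(sizeof(int) <= sizeof(long int), "int <= long");
    _Static_assert(sizeof(long int) <= sizeof(long long int), "long <= long long");
    _Static_assert(CHAR_BIT >= 8, "char is at least 8 bits");
    _Static_assert(INT_MAX >= 32767 && INT_MIN <= -32767,
                   "int covers at least -32767..+32767");

    int main(void) { return 0; }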


How do C compilers choose these sizes?

There are some common conventions that most compilers follow. long is often chosen to be as large as a machine word: on 64-bit machines where CHAR_BIT == 8 (which is nearly always the case, so I'll assume it for the rest of this answer) this means sizeof(long) == 8, and on 32-bit machines it means sizeof(long) == 4. (A notable exception is 64-bit Windows, whose LLP64 model keeps long at 32 bits.)

int is almost always 32 bits wide.

long long int is often 64 bits wide.
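To see which conventions your own compiler follows, a short program like this (an illustration, not part of the original answer) will print the sizes it chose:

    #include <stdio.h>

    int main(void)
    {
        /* Typical output on 64-bit Linux (LP64):    2 4 8 8
           Typical output on 64-bit Windows (LLP64): 2 4 4 8 */
        printf("sizeof(short)     = %zu\n", sizeof(short));
        printf("sizeof(int)       = %zu\n", sizeof(int));
        printf("sizeof(long)      = %zu\n", sizeof(long));
        printf("sizeof(long long) = %zu\n", sizeof(long long));
        return 0;
    }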

answered Sep 18 '22 by orlp