 

Why an extra integer type among short/int/long?

Until recently I believed that 'long' was the same thing as 'int', for historical reasons and because desktop processors all have at least 32 bits (and the only trouble I had was with that "dupe", since I only develop on 32-bit machines).

Reading this, I discover that, in fact, the C standard only requires 'int' to be at least 16 bits wide, while 'long' is required to be at least 32 bits.

In fact, in the list

  • Short signed integer type. Capable of containing at least the [−32767, +32767] range;
  • Basic signed integer type. Capable of containing at least the [−32767, +32767] range;
  • Long signed integer type. Capable of containing at least the [−2147483647, +2147483647] range;
  • Long long signed integer type. Capable of containing at least the [−9223372036854775807, +9223372036854775807] range.

there are always non-empty intersections between adjacent ranges, and therefore room for a duplicate, whatever widths the compiler and platform choose.
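
To check concretely which pair coincides on a given platform, a minimal C program (nothing beyond limits.h and sizeof) prints what the implementation actually chose:

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* The standard only fixes minimum ranges; the actual widths are
           implementation-defined, so which types coincide varies by platform. */
        printf("short:     %zu bytes, [%d, %d]\n", sizeof(short), SHRT_MIN, SHRT_MAX);
        printf("int:       %zu bytes, [%d, %d]\n", sizeof(int), INT_MIN, INT_MAX);
        printf("long:      %zu bytes, [%ld, %ld]\n", sizeof(long), LONG_MIN, LONG_MAX);
        printf("long long: %zu bytes, [%lld, %lld]\n", sizeof(long long), LLONG_MIN, LLONG_MAX);
        return 0;
    }

On a typical 64-bit Linux target (LP64) this prints 2/4/8/8 bytes, so long and long long are the duplicate pair there; on 64-bit Windows (LLP64) it is int and long instead.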

Why did the standards committee introduce an extra type into what could be as simple as char/short/int/long (or int_k, int_2k, int_4k, int_8k)?

Was that for historical reasons, like gcc x.x implementing int as 32 bits while another compiler implemented it as 16, or is there a real technical reason I'm missing?

Regis Portalez asked May 16 '16 14:05


1 Answer

The central point is that int/unsigned is not just another step on the char, short, int, long, long long ladder of integer sizes. int is special: it is the type that all narrower types promote to, and so it typically works "best" on a given processor. Whether int should match short, match long, or sit distinctly between short and long is therefore highly platform dependent.
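
To see the promotion in action, here is a minimal sketch (C11 is assumed for _Generic; the sizeof lines need only C99):

    #include <stdio.h>

    int main(void)
    {
        short a = 1, b = 2;

        /* Both operands of a + b are promoted to int before the addition,
           so the result has type int even though both operands are short. */
        printf("sizeof(short) = %zu\n", sizeof(short));
        printf("sizeof(a + b) = %zu\n", sizeof(a + b)); /* same as sizeof(int) */

        /* C11's _Generic names the resulting type directly. */
        printf("a + b has type: %s\n",
               _Generic(a + b, int: "int", short: "short", default: "other"));
        return 0;
    }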

C is designed to accommodate a wide range of processors. That C is 40+ years old is testament to a successful strategy.

chux - Reinstate Monica answered Nov 10 '22 19:11