If the size of "long" and "int" are the same on a platform - are "long" and "int" different in any way?

If the representation of a long int and an int are the same on a platform, are they strictly the same? Do the types behave differently in any way on that platform according to the C standard?

E.g., does this always work:

int int_var;
long long_var;

void long_bar(long *l);
void int_bar(int *i);

void foo() 
{
    long_bar(&int_var); /* Always OK? */
    int_bar(&long_var);
}

I guess the same question applies to short and int, if they happen to be the same representation.

The question arose when discussing how to define an int32_t-like typedef for an embedded C89 compiler without stdint.h, i.e. as int or long, and whether the choice would matter.

asked Mar 29 '21 by Vilhelm



3 Answers

They are not compatible types, which a simple example shows:

int* iptr;
long* lptr = iptr; // compiler error here

So it mostly matters when dealing with pointers to these types. Similarly, there is the "strict aliasing rule" which makes this code undefined behavior:

int i;
long* lptr = (long*)&i;
*lptr = ...;  // undefined behavior

Another subtle issue is implicit promotion. If you have some_int + some_long, the resulting type of that expression is long, or, if either operand is unsigned, unsigned long. This is because of the usual arithmetic conversions, see Implicit type promotion rules. It shouldn't matter most of the time, but code such as this will fail: _Generic(some_int + some_long, int: stuff() ), since the expression has type long and there is no long clause.

Generally, when assigning values between these types, there shouldn't be any problems. In the case of uint32_t, it doesn't matter which type it corresponds to, because you should treat uint32_t as a separate type anyway. I'd pick long for compatibility with small microcontrollers, where typedef unsigned int uint32_t; would break. (And correspondingly, typedef signed long int32_t; for the signed equivalent.)

answered Oct 23 '22 by Lundin


The types long and int have different ranks: the rank of long is higher than the rank of int. So in a binary expression involving an object of type long and an object of type int, the latter is always converted to the type long.

Compare the following code snippets.

int x = 0;
unsigned int y = 0;

the type of the expression x + y is unsigned int.

long x = 0;
unsigned int y = 0;

the type of the expression x + y is unsigned long (due to the usual arithmetic conversions), provided that sizeof(int) is equal to sizeof(long).

This is even more important in C++, where function overloading is allowed, than in C.

In C you have to take this into account, for example, when using I/O functions such as printf, to specify the correct conversion specifier.

answered Oct 23 '22 by Vlad from Moscow


Even on platforms where long and int have the same representation, the Standard would allow compilers to be willfully blind to the possibility that the act of storing a value through a long* might affect the value read through an int*, or vice versa. Given something like:

#include <stdint.h>

void store_to_int32(void *p, int index)
{
    ((int32_t*)p)[index] = 2;
}
int array1[10];
int test1(int index)
{
    array1[0] = 1;
    store_to_int32(array1, index);
    return array1[0];
}
long array2[10];
long test2(int index)
{
    array2[0] = 1;
    store_to_int32(array2, index);
    return array2[0];
}

The 32-bit ARM version of gcc will treat int32_t as synonymous with long and ignore the possibility that passing the address of array1 to store_to_int32 might cause the first element of that array to be written, and the 32-bit version of clang will treat int32_t as synonymous with int and ignore the possibility that passing the address of array2 to store_to_int32 might cause that array's first element to be written.

To be sure, nothing in the Standard would prohibit compilers from behaving in that fashion, but I think the Standard's failure to prohibit such blindness stems from the principle "the dumber something would be, the less need there should be to prohibit it".
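One portable way around this (a sketch, not something from the answer above): memcpy is permitted to access any object as bytes, so a compiler cannot assume the store leaves the caller's array untouched, regardless of whether int32_t is defined as int or long:

```c
#include <string.h>
#include <stdint.h>

/* Hypothetical aliasing-safe variant of store_to_int32: because the
   write goes through memcpy, both test1 and test2 above would have to
   reload array1[0]/array2[0] after this call rather than assume the
   stored 1 is still there. */
void store_to_int32_safe(void *p, int index)
{
    int32_t v = 2;
    memcpy((char *)p + (size_t)index * sizeof v, &v, sizeof v);
}
```

Modern compilers typically optimize a fixed-size memcpy like this into a single plain store, so the safety usually costs nothing.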

answered Oct 23 '22 by supercat