 

Range of values in C: int and long on 32- and 64-bit machines

Tags:

c

int

32bit-64bit

I'm confused about the range of values of an int variable in C.

I know that a 32-bit unsigned int has a range of 0 to 65,535, so long has 0 to 4,294,967,295.

This is fine on a 32-bit machine. But on 64-bit machines, does everything stay the same, or is my int's capacity different?

I know this is a newbie question, but I'm really confused. This function signature is not helping either. :)

unsigned long long int atomicAdd(unsigned long long int* address, unsigned long long int val); 
Custodio asked May 27 '11


People also ask

What is the range of int in 32-bit?

A signed integer is a 32-bit datum that encodes an integer in the range [-2147483648 to 2147483647]. An unsigned integer is a 32-bit datum that encodes a nonnegative integer in the range [0 to 4294967295].

What is the range of 64-bit integer?

A 64-bit signed integer. It has a minimum value of -9,223,372,036,854,775,808 and a maximum value of 9,223,372,036,854,775,807 (inclusive).

What is the range of the 64-bit long data type?

Holds signed 64-bit (8-byte) integers ranging in value from -9,223,372,036,854,775,808 through 9,223,372,036,854,775,807 (9.2... E+18).

Is long int 32 or 64-bit?

Windows: long and int remain 32-bit in length, and special new data types are defined for 64-bit integers.
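
A quick way to check what your own platform does is to print the sizes directly; a minimal C sketch:

#include <stdio.h>

int main(void)
{
    /* Typical output on 64-bit Linux/macOS (LP64 model): 4, 8, 8.
       Typical output on 64-bit Windows (LLP64 model):    4, 4, 8. */
    printf("int:       %zu bytes\n", sizeof(int));
    printf("long:      %zu bytes\n", sizeof(long));
    printf("long long: %zu bytes\n", sizeof(long long));
    return 0;
}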


2 Answers

In C and C++ you have these minimum requirements (i.e., actual implementations can have larger magnitudes):

signed char: -2^07+1 to +2^07-1
short:       -2^15+1 to +2^15-1
int:         -2^15+1 to +2^15-1
long:        -2^31+1 to +2^31-1
long long:   -2^63+1 to +2^63-1

Now, on particular implementations, you have a variety of bit widths. The Wikipedia article on C data types describes this nicely.
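
For reference, here is a minimal C sketch (mine, not part of the original answer) that prints the ranges your implementation actually provides, using the macros from <limits.h>:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* The standard only fixes the minimum magnitudes listed above;
       the actual values below vary by platform and data model. */
    printf("signed char: %d to %d\n", SCHAR_MIN, SCHAR_MAX);
    printf("short:       %d to %d\n", SHRT_MIN, SHRT_MAX);
    printf("int:         %d to %d\n", INT_MIN, INT_MAX);
    printf("long:        %ld to %ld\n", LONG_MIN, LONG_MAX);
    printf("long long:   %lld to %lld\n", LLONG_MIN, LLONG_MAX);
    return 0;
}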

Johannes Schaub - litb answered Oct 13 '22


No, int in C is not defined to be 32 bits. int and long are not defined to be any specific size at all. The language only guarantees the ordering sizeof(char) <= sizeof(short) <= sizeof(int) <= sizeof(long) <= sizeof(long long), plus the minimum ranges shown in the other answer.
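
That ordering can even be verified at compile time; a small sketch, assuming a C11 compiler:

#include <assert.h>  /* provides static_assert in C11 */

/* These assertions hold on every conforming implementation,
   so this translation unit compiles everywhere. */
static_assert(sizeof(char)  <= sizeof(short),     "char <= short");
static_assert(sizeof(short) <= sizeof(int),       "short <= int");
static_assert(sizeof(int)   <= sizeof(long),      "int <= long");
static_assert(sizeof(long)  <= sizeof(long long), "long <= long long");

int main(void) { return 0; }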

Theoretically a compiler could make char, short, int, and long all the same number of bits. I know of some that actually did that for all of those types except char.

This is why C (since C99) defines fixed-width types like uint16_t and uint32_t in <stdint.h>. If you need a specific size, you are supposed to use one of those.
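
A minimal sketch of using those fixed-width types, with the matching format macros from <inttypes.h>:

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void)
{
    uint16_t a = 65535u;        /* exactly 16 bits on any platform */
    uint32_t b = 4294967295u;   /* exactly 32 bits on any platform */
    int64_t  c = INT64_MIN;     /* exactly 64 bits, signed */

    /* PRIu16/PRIu32/PRId64 expand to the correct printf
       specifiers for each platform's underlying types. */
    printf("a = %" PRIu16 "\n", a);
    printf("b = %" PRIu32 "\n", b);
    printf("c = %" PRId64 "\n", c);
    return 0;
}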

T.E.D. answered Oct 13 '22