unsigned short vs unsigned int - sometimes they are the same range? [closed]

Tags: c, types, int, short

What's the difference between unsigned short and unsigned int? I found that unsigned short is 0-65,535 and unsigned int is 0-65,535 or 0-4,294,967,295. I don't understand the difference very well. How can I know the size of a data type on my architecture? And if, for example, c = (unsigned short) d; where c is an unsigned short and d is an unsigned int, what does that mean? Are the first 16 bits of d assigned to c?

asked Mar 17 '13 by mpluse


2 Answers

This is a useful link explaining the history of C data types:

http://en.wikipedia.org/wiki/C_data_types

So the size of your data type is platform-dependent, but if your int is 32 bits in length then it can represent one of 2^32 different numbers (0 - 4,294,967,295 if unsigned). Similarly, if your short is 16 bits in length then it can represent one of 2^16 different numbers (0 - 65,535 if unsigned).

This link gives you the implementation details for Visual Studio 2005, where ints are 32 bits (4 bytes) in size and shorts are 16 bits (2 bytes):

http://msdn.microsoft.com/en-us/library/s3f49ktz(v=vs.80).aspx

Your exact implementation will depend on your compiler.
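A quick way to see what your own compiler uses is to print the sizes and limits directly. Here is a minimal sketch using sizeof and the constants from <limits.h>; the printed values depend on your platform:

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* sizeof reports the size in bytes; CHAR_BIT is the number of bits per byte */
        printf("unsigned short: %zu bytes, max %u\n",
               sizeof(unsigned short), (unsigned) USHRT_MAX);
        printf("unsigned int:   %zu bytes, max %u\n",
               sizeof(unsigned int), UINT_MAX);
        printf("bits per byte:  %d\n", CHAR_BIT);
        return 0;
    }

On a typical platform with a 16-bit short and a 32-bit int this prints 2 bytes / 65535 and 4 bytes / 4294967295.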

As for the last part of your question: yes, if you convert an unsigned int whose value is larger than the short's maximum down to an unsigned short, you will end up with a different value. For unsigned targets the result is well defined by the C standard: the value is reduced modulo USHRT_MAX + 1, which on a platform with a 16-bit short means you keep the low-order 16 bits.
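As an illustration, with a 16-bit unsigned short the converted value is the original reduced modulo 65536, i.e. the low-order 16 bits (a small sketch assuming those common widths):

    #include <stdio.h>

    int main(void)
    {
        unsigned int   d = 0x12345678u;          /* too big for a 16-bit short */
        unsigned short c = (unsigned short) d;   /* keeps d modulo 65536       */

        /* With a 16-bit unsigned short this prints: d = 0x12345678, c = 0x5678 */
        printf("d = 0x%08X, c = 0x%04X\n", d, (unsigned) c);
        return 0;
    }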

answered Sep 26 '22 by gavinj500


You're really asking what the difference is between short and int. The answer is that short may be narrower than int, but it may also be the same width. That's virtually all we know for sure, independent of platform. A lot of platforms have a 32-bit int and a 16-bit short, but not all.
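If you want to know which case applies for a particular compiler, you can compare the ranges from <limits.h> (a minimal sketch):

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* Compare the maximum values rather than sizeof, since padding bits are allowed */
        if (USHRT_MAX == UINT_MAX)
            puts("unsigned short and unsigned int have the same range here");
        else
            puts("unsigned short is narrower than unsigned int here");
        return 0;
    }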

answered Sep 22 '22 by John Zwinck