I'm having a hard time grasping data types in C. I'm going through a C book, and one of the challenges asks for the maximum and minimum numbers a short can store.
Using sizeof(short);
I can see that a short consumes 2 bytes. That means it's 16 bits, which I figured means two digits, since it takes 8 bits to store the binary representation of a number. For example, 9 would be 00111001, which fills one byte. So wouldn't the range be 0 to 99 for unsigned, and -9 to 9 for signed?
I know I'm wrong, but I'm not sure why. My reference says the maximum is (-)32,767 for signed and 65,535 for unsigned:
short int: 2 bytes (16 bits), range -32,768 to +32,767
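For what it's worth, here's the check I ran (a minimal sketch; the output is from my machine, since sizes can vary between implementations):

    #include <stdio.h>

    int main(void)
    {
        /* sizeof yields a size_t, so print it with %zu */
        printf("sizeof(short) = %zu bytes\n", sizeof(short)); /* prints 2 here */
        return 0;
    }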
This is defined in <limits.h> as SHRT_MIN and SHRT_MAX.
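A minimal program to print them might look like this (a sketch assuming a typical hosted implementation; the exact values are implementation-defined):

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        printf("SHRT_MIN  = %d\n", SHRT_MIN);            /* typically -32768 */
        printf("SHRT_MAX  = %d\n", SHRT_MAX);            /* typically  32767 */
        printf("USHRT_MAX = %u\n", (unsigned)USHRT_MAX); /* typically  65535 */
        return 0;
    }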
The reference you read is correct, at least for the usual C implementations where short is 16 bits. The width isn't actually fixed by the standard, which only requires that a short be at least 16 bits.
16 bits can hold 2^16 = 65,536 possible bit patterns. Signed shorts range from -32,768 to 32,767 (on two's-complement systems), and unsigned shorts from 0 to 65,535.
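If you want to see that arithmetic directly, here's a sketch that derives both ranges from the bit width (it assumes two's-complement representation, which virtually all current implementations use):

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        unsigned bits = CHAR_BIT * sizeof(short); /* 8 * 2 = 16 on typical platforms */
        unsigned long patterns = 1UL << bits;     /* 2^16 = 65536 bit patterns */

        printf("%u bits -> %lu patterns\n", bits, patterns);
        printf("unsigned short: 0 to %lu\n", patterns - 1);  /* 0 to 65535 */
        printf("signed short:   -%lu to %lu\n",              /* -32768 to 32767 */
               patterns / 2, patterns / 2 - 1);
        return 0;
    }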