 

What type-conversions are happening?

#include <stdio.h>

int main()
{
    int x = -13701;
    unsigned int y = 3;
    signed short z = x / y;

    printf("z = %d\n", z);

    return 0;
}

I would expect the answer to be -4567. I am getting "z = 17278". Why does a promotion of these numbers result in 17278?

I executed this in Code Pad.

Robert asked Dec 31 '25


1 Answer

The hidden type conversions are:

signed short z = (signed short) (((unsigned int) x) / y);

When you mix signed and unsigned types of the same rank, the unsigned type wins: the usual arithmetic conversions convert x to unsigned int, the division is performed in unsigned arithmetic, and that result is then narrowed to signed short. With 32-bit integers:

(unsigned) -13701         == (unsigned) 0xFFFFCA7B // Bit pattern
(unsigned) 0xFFFFCA7B     == (unsigned) 4294953595 // Re-interpret as unsigned
(unsigned) 4294953595 / 3 == (unsigned) 1431651198 // Divide by 3
(unsigned) 1431651198     == (unsigned) 0x5555437E // Bit pattern of that result
(short) 0x5555437E        == (short) 0x437E        // Strip high 16 bits
(short) 0x437E            == (short) 17278         // Re-interpret as short

By the way, the signed keyword is unnecessary. signed short is a longer way of saying short. The only type that needs an explicit signed is char. char can be signed or unsigned depending on the platform; all other types are always signed by default.

John Kugelman answered Jan 02 '26