How does printf know when it's being passed a signed int

Tags:

c

linux

I'm trying to figure out how variables really work in C, and I find it strange that printf seems to know the difference between two variables of the same size (I'm assuming they're both 16 bits).

#include <stdio.h>

int main(void) {
    unsigned short int positive = 0xffff;
    short int negative = 0xffff;

    printf("%d\n%d\n", positive, negative);
    return 0;
}

Output:

65535
-1
asked Nov 07 '22 by Paul Lord


1 Answer

I think we have to distinguish more carefully between the implicit conversions among the different integer types on the one hand, and the printf format specifiers (which force printf to interpret the data in a particular way) on the other.

Systematically:

printf("%hd %hd\n", positive, negative);
// gives: -1 -1

Both values are interpreted as signed short int by printf, regardless of the declaration.

printf("%hu %hu\n", positive, negative);
// gives: 65535 65535

Both values are interpreted as unsigned short int by printf, regardless of the declaration.
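
What makes this work is that printf never actually receives a short: variadic arguments undergo the default argument promotions, so both values arrive as int, and %hd/%hu tell printf to convert them back to (unsigned) short before printing. A minimal sketch with the promotion written out explicitly (the casts are mine; converting 65535 back to short assumes the usual two's-complement behavior):

#include <stdio.h>

int main(void) {
    unsigned short positive = 0xffff;
    short negative = -1;  /* same 16-bit pattern as 0xffff */

    /* The default argument promotions widen both shorts to int before
       printf sees them; %hd / %hu convert the promoted value back to
       (unsigned) short, so only the low 16 bits matter. */
    printf("%hd %hd\n", (int)positive, (int)negative);  /* -1 -1 */
    printf("%hu %hu\n", (int)positive, (int)negative);  /* 65535 65535 */
    return 0;
}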

However,

printf("%d %d\n", positive, negative);
// gives: 65535 -1

Both values are implicitly promoted to (the wider) int, and the value, sign included, is preserved.
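
The same output can be reproduced by writing that promotion out by hand; a short sketch (my own casts, placed inside the questioner's main):

/* 65535 fits in a 32-bit int, so the unsigned short keeps its value;
   the signed short -1 is sign-extended and stays -1. */
printf("%d %d\n", (int)positive, (int)negative);  /* 65535 -1 */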

Finally,

printf("%u %u\n", positive, negative);
// gives: 65535 4294967295

Again, both values are implicitly promoted to int with their sign preserved, but %u then makes printf interpret the promoted negative value as unsigned. As the output shows, plain int is 32 bits wide on this system.
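
The printed 4294967295 is exactly UINT_MAX for a 32-bit int, i.e. -1 reinterpreted modulo 2^32. A quick check (again inside the questioner's main; <limits.h> is needed for UINT_MAX and CHAR_BIT):

/* -1 and UINT_MAX share the same 32-bit pattern here; strictly, %u
   expects an unsigned int, so the cast makes the reinterpretation
   explicit instead of leaving it to printf. */
printf("%u\n", (unsigned int)(int)negative);  /* 4294967295 */
printf("%u\n", UINT_MAX);                     /* the same value */
printf("%zu\n", sizeof(int) * CHAR_BIT);      /* 32 */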


Curiously, gcc only warns about the assignment short int negative = 0xffff; when I compile with -Wpedantic. The warning is justified: 0xffff is 65535, which exceeds SHRT_MAX for a 16-bit short, so the implicit conversion to short is implementation-defined.
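
Two ways to make the intent explicit (a sketch; whether each variant actually silences the warning is an assumption about gcc's diagnostics, not something tested here):

short negative = (short)0xffff;   /* explicit narrowing cast */
/* or */
short negative = -1;              /* in-range value, no narrowing at all */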

answered Nov 15 '22 by Gerd