 

Why do I get the same value when printing this int?

Tags: c, gcc

So I'm just tinkering around with C and wanted to see if I could assign a binary value to an integer and then use printf() to output it as either a signed or an unsigned value. But I get the same output regardless; I thought printing it as signed would give half the value of printing it as unsigned. I'm using Code::Blocks and GCC.

Does printf() ignore the %i and %u specifiers and use the variable's declared type instead?

Sample Code:

#include <stdio.h>
#include <stdlib.h>

int main()
{
    signed int iNumber = 0b1111111111111111;
    printf("Signed Int : %i\n", iNumber);
    printf("Unsigned Int : %u\n", iNumber);

    return 0;
}

Same result if I change the int to unsigned:

#include <stdio.h>
#include <stdlib.h>

int main()
{
    unsigned int iNumber = 0b1111111111111111;
    printf("Signed Int : %i\n", iNumber);
    printf("Unsigned Int : %u\n", iNumber);

    return 0;
}
Asked Oct 22 '10 by ProfessionalAmateur


3 Answers

I assume charCount (presumably from an earlier version of your code) should be iNumber. Both programs have undefined behavior, since each uses the wrong conversion specifier once: %u with a signed int in the first, and %i with an unsigned int in the second.

In practice (for most implementations), printf relies on you to tell it what to pop off the stack; this is necessary because it's a variadic function, so the format string is its only description of the arguments. Internally, va_arg takes the type to pop as its second parameter.
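To illustrate the mechanism, here is a minimal sketch of a variadic function (the helper name sum_ints is hypothetical, not from the question): va_arg fetches each argument using whatever type the code names, trusting the caller completely, just as printf trusts its format string.

#include <stdarg.h>
#include <stdio.h>

/* Sum `count` ints passed as variadic arguments. va_arg can only
 * fetch what the caller actually passed; the type named here is
 * taken on faith, exactly like printf's conversion specifiers. */
int sum_ints(int count, ...)
{
    va_list args;
    int total = 0;

    va_start(args, count);
    for (int i = 0; i < count; i++)
        total += va_arg(args, int); /* the type to pop is the second parameter */
    va_end(args);

    return total;
}

int main(void)
{
    printf("%d\n", sum_ints(3, 1, 2, 3)); /* prints 6 */
    return 0;
}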

The bits stored in iNumber are the same in both programs before and after the assignment. So printf ends up reading the same bits, just interpreting them through different types.

The reason you get the same result for %i and %u is that the leftmost (sign) bit is unset, so the value is interpreted as positive by both specifiers.

Finally, you should note that binary literals (0b) are a GCC extension, not standard C.
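As a rough illustration of the sign-bit point (not from the original answer, and assuming the common case of 32-bit ints; the signed conversion below is implementation-defined, though GCC wraps it as shown):

#include <stdio.h>

int main(void)
{
    /* Set only the leftmost bit of a 32-bit int. As an unsigned
     * value this is 2147483648; reinterpreted as signed (an
     * implementation-defined conversion, which GCC wraps) it is
     * INT_MIN. */
    unsigned int uNumber = 0x80000000u;

    printf("Signed Int : %i\n", (int)uNumber);  /* -2147483648 on GCC */
    printf("Unsigned Int : %u\n", uNumber);     /* 2147483648 */

    return 0;
}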

Answered Nov 12 '22 by Matthew Flaschen


Because for positive numbers, the binary representation on most platforms is the same between signed and unsigned.
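A quick way to convince yourself of this (my sketch, not part of the original answer) is to compare the object representations byte by byte:

#include <stdio.h>
#include <string.h>

int main(void)
{
    signed int   s = 65535; /* same value as 0b1111111111111111 */
    unsigned int u = 65535;

    /* memcmp compares the raw bytes of the two objects. */
    printf("Same bytes: %s\n",
           memcmp(&s, &u, sizeof s) == 0 ? "yes" : "no"); /* prints "yes" */

    return 0;
}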

Answered Nov 12 '22 by Puppy


First, you seem to be printing something called charCount without error, and that 0b way of specifying a number isn't standard C, so I'd check that it's doing what you think it is. For a standard way of specifying bit patterns like that, use the octal (number begins with a zero) or hex (number begins with 0x) formats.
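For example, here are three ways to write the same 16-bit all-ones pattern (a sketch of mine; only the last spelling depends on the GCC extension):

#include <stdio.h>

int main(void)
{
    int fromHex   = 0xFFFF;              /* hexadecimal: standard C */
    int fromOctal = 0177777;             /* octal: standard C */
    int fromBin   = 0b1111111111111111;  /* binary: GCC extension */

    printf("%d %d %d\n", fromHex, fromOctal, fromBin); /* 65535 65535 65535 */

    return 0;
}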

Second, almost all computers have the same binary representation for a positive integer and its unsigned equivalent, so there will be no difference. There will be a difference if the number is negative, and that depends typically on the most significant bit. Your ints could be of any size from 16 bits on up, although on a desktop, laptop, or server it's very probably 32, and almost certainly either 32 or 64.
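You can check the width on your own platform with something like this (my sketch, using the standard <limits.h> constants):

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* CHAR_BIT * sizeof(int) gives the width of int in bits. */
    printf("int is %zu bits wide\n", sizeof(int) * CHAR_BIT);
    printf("INT_MIN = %d, INT_MAX = %d\n", INT_MIN, INT_MAX);

    return 0;
}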

Third, printf() knows nothing about the type of the data you pass to it. When called, it is even incapable of knowing the sizes of its arguments, or how many there are. It derives that from the format specifiers, and if those don't agree with the arguments passed there can be problems. It's probably the worst thing about printf().
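The safe habit, then, is to keep each specifier in agreement with its argument's type, casting when you deliberately want the other interpretation. A sketch of what the question's prints might look like with matching types:

#include <stdio.h>

int main(void)
{
    int iNumber = 0xFFFF;

    /* Match each specifier to the type actually passed; the cast
     * makes the reinterpretation explicit instead of relying on
     * printf to guess. */
    printf("Signed Int : %i\n", iNumber);
    printf("Unsigned Int : %u\n", (unsigned int)iNumber);

    return 0;
}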

Answered Nov 12 '22 by David Thornley