
printf displays something weird

Tags: c

I have the following code:

#include <stdio.h>

int main() {
  float d = 1.0;
  int i = 2;
  printf("%d %d", d, i);
  getchar();
  return 0;
}

And the output is:

0 1072693248

I know there is an error in the printf call and that the first %d should be replaced with %f. But why is the variable i printed incorrectly (1072693248 instead of 2)?

asked Sep 15 '11 by scdmb

2 Answers

Since you specified %d instead of %f, what you're really seeing is the binary representation of d, read back as integers.

Also, since the datatypes don't match, the code actually has undefined behavior.
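For comparison, here is the same program with the specifiers matched to the argument types; the only substantive change is %f for the float (which the varargs mechanism promotes to double, exactly what %f expects), plus a newline:

#include <stdio.h>

int main() {
  float d = 1.0;
  int i = 2;
  printf("%f %d\n", d, i);  /* matching specifiers: prints "1.000000 2" */
  getchar();
  return 0;
}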

EDIT:

Now to explain why you don't see the 2:

When passed through printf's variable argument list, a float is promoted to double, which is (in this case) 8 bytes long. Your format string, however, asks for two ints (4 bytes each here), so the two %d conversions together consume exactly the 8 bytes of the double 1.0: the low word is 0 and the high word is 1072693248 (0x3FF00000). The 2 is never printed because it lies beyond those 8 bytes.
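You can verify this by inspecting the bytes of 1.0 yourself. Here is a minimal sketch, assuming a little-endian machine with 4-byte int and 8-byte double (the layout described above):

#include <stdio.h>
#include <string.h>
#include <stdint.h>

int main() {
  double d = 1.0;               /* bit pattern 0x3FF0000000000000 */
  uint32_t words[2];
  memcpy(words, &d, sizeof d);  /* a well-defined way to view the bytes */
  printf("%u %u\n", (unsigned)words[0], (unsigned)words[1]);
  return 0;                     /* prints: 0 1072693248 */
}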

answered Oct 24 '22 by Mysticial


printf doesn't just use the format codes to decide how to print its arguments. It uses them to decide how to access its arguments (it uses va_arg internally). Because of this, when you give the wrong format code for the first argument (%d instead of %f) you don't just mess up the printing of the first argument, you make it look in the wrong place for all subsequent arguments. That's why you're getting nonsense for the second argument.
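To see that mechanism in miniature, here is a sketch of a toy printf-style function (mini_print is a made-up name for illustration). Each conversion character drives one va_arg call, so one wrong character makes every later argument read from the wrong place:

#include <stdio.h>
#include <stdarg.h>

void mini_print(const char *fmt, ...) {
  va_list ap;
  va_start(ap, fmt);
  for (; *fmt; fmt++) {
    if (*fmt == 'd')
      printf("%d ", va_arg(ap, int));     /* consumes an int argument */
    else if (*fmt == 'f')
      printf("%f ", va_arg(ap, double));  /* consumes a double argument */
  }
  va_end(ap);
  putchar('\n');
}

int main() {
  mini_print("fd", 1.0, 2);  /* correct: prints "1.000000 2" */
  /* mini_print("dd", 1.0, 2) would be undefined behavior,
     just like the printf in the question. */
  return 0;
}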

answered Oct 24 '22 by zwol