Strange printf output [duplicate]

I executed the following code:

#include <stdio.h>

int main()
{
    printf("%f\n", 9/5);
}

Output: 0.000000

Why is the output not 1?

If I write

printf("%f %f %d %d\n", (float)9/5, 4, sizeof(float), sizeof(int));

then the output is 1.800000 0.000000 4 59.

Why is it not 1.800000 4 4 4?

On my machine, sizeof(float) is 4.

Thanks in advance.

chinmayaposwalia asked Apr 30 '26 09:04

1 Answer

This is because your printf format specifier doesn't match what you passed it:

9/5 is integer division: both operands are int, so the result is the int 1. But %f tells printf to expect a floating-point value.

So you need to make the division floating-point, either by casting an operand or by making one of the literals a floating-point constant:

printf("%f\n", (float)9/5);
printf("%f\n", 9./5);

As for why you're getting 0.000000: this is undefined behavior, but in practice printf() is reading the binary representation of the int 1 and printing it as a floating-point value, which happens to be a tiny denormal (subnormal) value very close to 0.0.
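To illustrate that claim, here is a small sketch (not part of the original answer, and assuming a 32-bit int and IEEE 754 floats, as on the asker's machine) that reinterprets the bit pattern of the int 1 as a float via memcpy:

#include <stdio.h>
#include <string.h>

int main(void)
{
    int i = 1;
    float f;
    memcpy(&f, &i, sizeof f);  /* reinterpret the int's bits as a float */
    printf("%e\n", f);         /* prints 1.401298e-45, a tiny subnormal */
    return 0;
}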

EDIT: There's also something going on with type promotion in varargs.

In vararg functions, float is promoted to double, so printf() in this case actually expects a 64-bit parameter holding a double. But you only passed it a 32-bit operand, so it reads an extra 32 bits from the stack (which happen to be zero in this case): even more undefined behavior.
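The promotion rule can be demonstrated with a hand-written variadic function: va_arg must name double, never float, because any float argument has already been widened by the time the callee sees it. A minimal sketch:

#include <stdio.h>
#include <stdarg.h>

/* prints n floating-point arguments; they all arrive as doubles */
static void print_floats(int n, ...)
{
    va_list ap;
    va_start(ap, n);
    for (int i = 0; i < n; i++)
        printf("%f\n", va_arg(ap, double)); /* va_arg(ap, float) would be undefined */
    va_end(ap);
}

int main(void)
{
    print_floats(2, 1.8f, 2.5f);  /* the float arguments are promoted to double */
    return 0;
}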

Mysticial answered May 02 '26 21:05