printf("%f", 1.0); // prints 1.000000
but
printf("%f", 1);  // prints 0.0 (on my machine)
How did the conversion happen?
printf("%f", 1); causes undefined behavior: the %f conversion specifier expects a double, but you passed an int. No conversion happens, because printf is a variadic function, so the compiler does not know the expected types and cannot convert the argument for you; only the default argument promotions apply (e.g. float to double), and int is not promoted to double. Since the behavior is undefined, there is no explanation of why it prints 0.0 — any output is possible.