Possible Duplicate:
Difference between format specifiers %i and %d in printf
I just checked the reference, and it says both of them indicate a signed integer. I thought there must be some difference.
There is no difference when used with printf. (With scanf, however, the two do differ: %i also accepts hexadecimal and octal input, while %d reads only decimal.)
From the C99 standard document, section 7.19.6.1:
d, i
The int argument is converted to signed decimal in the style [−]dddd. The precision specifies the minimum number of digits to appear; if the value being converted can be represented in fewer digits, it is expanded with leading zeros. The default precision is 1. The result of converting a zero value with a precision of zero is no characters.