Why do different compilers produce different outputs for the same C program? If there is a standard C, why don't these well-known compilers conform to it completely? I've heard the difference in output is caused by 16-bit vs. 32-bit compilers, so what exactly are the issues that cause the difference?
Do you have an example?
The language is standardized, but a lot of aspects of it are implementation-defined or even undefined.
For example, this:
printf("sizeof (int) = %u\n", (unsigned)sizeof (int));
will print different numbers on different systems, depending on how big int is.