In this C program:
#include <stdio.h>

int main()
{
#if UnDefinedSymbolicConstant == 0
    printf("UnDefinedSymbolicConstant is equal to 0\n");
#else
    printf("UnDefinedSymbolicConstant is not equal to 0\n");
#endif
    return 0;
}
UnDefinedSymbolicConstant has not been #defined anywhere, yet it is treated as 0, and on gcc 4.3.4 the output is:

UnDefinedSymbolicConstant is equal to 0

So, is this standard behaviour, or does it only work like this in gcc?
Yes, this is specified by the standard, in §6.10.1 (conditional inclusion):

After all replacements due to macro expansion and the defined unary operator have been performed, all remaining identifiers (including those lexically identical to keywords) are replaced with the pp-number 0