I have a habit of using the following syntax in my compile-time flags:

#if (defined(A) & defined(B))

It's usually suggested that I do it with && instead, as follows:

#if (defined(A) && defined(B))

I know the difference between the two operators, and that in normal code && would short-circuit. However, the above is all handled by the preprocessor. Does it even matter which I use? Does it affect compile time by some infinitesimal amount, because the second defined() is never evaluated?
Since defined(SOMETHING) yields 0 or 1, so that you're guaranteed 0 or 1 on both sides, it doesn't make a technical difference whether you use & or &&.

It's mostly about good habits (using & could carry over to some situation where it would be wrong) and about writing code that is easy to grasp by simple pattern matching. A & in there causes a millisecond pause while one considers whether it could possibly be a bit-level thing.

On the third hand, you can't use the keyword and, which you can¹ use in ordinary C++ code.
Notes:
¹ With Visual C++ you can use and via a forced include of <iso646.h>.
According to the C99 standard, the expressions used in the preprocessor are constant expressions as defined by the C language itself, and are evaluated using the same engine. Therefore, && is a logical AND operator that short-circuits based on its LHS, and & is a bitwise operator with no predefined order of evaluation.

In practical terms, when used with defined() as you are, there is no difference between the two. However, the following would show a difference:
#define A 2
#define B 5
#if (A && B)
printf("A && B\n");
#endif
#if (A & B)
printf("A & B\n");
#endif
In this case, A && B will be output, but not A & B, since the result of the bitwise AND 2 & 5 is 0 (binary 010 & 101 has no bits in common).
I would like to add to the previous answers that it can actually matter a lot in a situation like this:
#define A 0
#define B 21
#if (A != 0) && (42 / A == B)
/* ... */
#endif
Here, if A == 0, the compiler will not break: && short-circuits on its false left-hand side, so 42 / A is never evaluated. Writing (A != 0) & (42 / A == B) instead will make the compiler complain about a division by zero.