This question came to mind after seeing this simple piece of code:
if (!x%y)
{
// do something
}
Maybe it's the influence of early C books (K&R?), but isn't the following always preferred, if not as cute?
if (x%y != 0)
{
// do something
}
This quote answers your question.
"Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it." – Brian W. Kernighan
Are you sure about that code? !x%y means (!x)%y, because ! binds tighter than %. (For that reason alone, I would prefer x % y != 0.)
First of all, props to everyone noticing that (!x%y) is not equivalent to (!(x%y)), but more importantly, neither of them is equivalent to:
if (x % y != 0)
which has a much nicer form:
if (x % y)
Personally I try not to write ==0 when it can be replaced by use of ! without introducing excessive parentheses, and I absolutely never use !=0, but this is a discussion that will start flamewars. :-)