I'm asking this off the back of a Jeff Atwood tweet which shows the following outcomes of null / zero comparison in JavaScript:
I've seen this before and as amusing as it is, I'm wondering if there is actually logic or reasoning behind the behaviour?
0 == null is never true: with "loose comparison", null is only equal to itself or to undefined.
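A quick sketch of that rule, runnable in any JavaScript engine:

```javascript
// Loose equality (==): null compares equal only to null and undefined.
// No numeric coercion is performed on null here.
console.log(null == null);      // true
console.log(null == undefined); // true
console.log(null == 0);         // false
console.log(null == false);     // false
console.log(null == '');        // false
```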
However, the relational operators (<, <=, >, >=) convert their operands to numbers first if either operand is a number. Since 0 is a number, null is converted to a number, and the numeric value of null is 0. So you end up comparing:
0 > 0 // nope
0 >= 0 // yes
0 == null // nope, null is only equal to null and undefined
0 <= 0 // yes
0 < 0 // nope
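The same results can be verified directly, along with the numeric coercion that explains them:

```javascript
// Relational operators apply ToNumber to null; Number(null) is 0.
console.log(Number(null)); // 0

console.log(null > 0);     // false (behaves like 0 > 0)
console.log(null >= 0);    // true  (behaves like 0 >= 0)
console.log(null == 0);    // false (loose equality, no numeric coercion)
console.log(null <= 0);    // true  (behaves like 0 <= 0)
console.log(null < 0);     // false (behaves like 0 < 0)
```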
These rules are all defined in the ECMAScript specification (whether they make sense or not is a different question).