In JavaScript, why does "" < {} evaluate to true, but "" < [] evaluate to false?
Because < coerces its arguments. It prefers to compare numbers, but if the objects don't support a numeric primitive value, it does strings and you end up doing a lexical comparison. String({}) is "[object Object]", but String([]) is "". "" < "[object Object]" is true, but "" < "" is false.
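You can see this coercion directly in a console (a quick sketch; the exact formatting of console output varies by environment):

```javascript
// String coercion of the two objects
console.log(String({})); // "[object Object]"
console.log(String([])); // ""

// The comparisons from the question
console.log("" < {}); // true  (compared as "" < "[object Object]")
console.log("" < []); // false (compared as "" < "")
```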
All the gory details are in the spec (fair warning, the language is turgid to put it mildly).
From your comment:
If it tries numbers first, consider that Number({}) is NaN and Number([]) is 0. Comparing NaN < 0 evaluates to false, and so does 0 < NaN. Why are these results ignored?
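The values the comment mentions are easy to confirm (a quick check, not part of the original answer):

```javascript
// Numeric coercion gives NaN for a plain object and 0 for an empty array
console.log(Number({})); // NaN
console.log(Number([])); // 0

// Any relational comparison involving NaN is false
console.log(NaN < 0); // false
console.log(0 < NaN); // false
```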
I put that poorly when I originally said "...it tries numbers first..." (I've updated that). It doesn't; it just prefers numbers. Again, the spec covers this in soporific detail (various links from the above), but basically:
1. The < operation does the abstract ToPrimitive operation on its operands.
2. ToPrimitive calls the objects' [[DefaultValue]] method with the "hint" "Number".
3. [[DefaultValue]] (hint = Number) invokes the valueOf method of the objects and, if that method returns a primitive, returns it; if the result isn't a primitive, [[DefaultValue]] returns the result of toString instead.
4. The valueOf method of objects (including arrays) returns the original object unchanged, so the result is [[DefaultValue]] returning the result of toString.
5. The < operation sees that the operands are both strings and compares them lexically.

Whereas if the operands were primitive numbers, ToPrimitive would return the numbers unchanged and < would compare them mathematically; if they were Number instances (remember JavaScript has both primitive and object versions of numbers, strings, and booleans), Number#valueOf would be called by [[DefaultValue]], and Number#valueOf returns the primitive number value. And so < would compare them mathematically.
Fun, eh?