 

Why is "" == [null] true in JavaScript?

Tags:

javascript

I know JavaScript has lots of insane results with comparisons between types, though I don't fully understand why. Came across this one today.

Why does

"" == [null] 

evaluate to true in JavaScript?

Some more JavaScript equality amusement, thanks to @Qantas:

  • Why does 2 == [2] in JavaScript?
  • Why is 0 == "" true in JavaScript
  • Why if([]) is validated while [] == false in javascript?
  • Why does !{}[true] evaluate to true in JavaScript?
Scott Stafford asked Aug 27 '14

People also ask

Is null == false in JavaScript?

The value null represents the intentional absence of any object value. It is one of JavaScript's primitive values and is treated as falsy in boolean contexts. Even so, null == false evaluates to false: the == algorithm never compares null to booleans by falsiness, because null is loosely equal only to undefined and itself.

Is null equal to true in JavaScript?

Despite the fact that null is a falsy value (i.e. it evaluates to false if coerced to a boolean), it isn't considered loosely equal to any of the other falsy values in JavaScript. In fact, the only values that null is loosely equal to are undefined and itself.
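This claim is easy to verify directly in a console; a minimal check:

```javascript
// null is falsy, yet loosely equal only to undefined and itself
console.log(null == undefined); // true
console.log(null == null);      // true
console.log(null == false);     // false
console.log(null == 0);         // false
console.log(null == "");        // false
```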

Why null === undefined is false in JavaScript?

undefined == null is true not because both are falsy, but because the == algorithm treats them as equal by a special rule. The strict equality operator (===), however, checks both type and value; since undefined and null are of different types (as the typeof operator shows), the strict equality operator returns false.
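A quick sketch of the type difference that makes === fail here:

```javascript
console.log(null == undefined);  // true  (special-cased by ==)
console.log(null === undefined); // false (types differ, so === short-circuits)
console.log(typeof null);        // "object" (a long-standing quirk)
console.log(typeof undefined);   // "undefined"
```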

Why is null == 0 false?

Relational comparisons convert null to a number, treating it as 0. That's why null >= 0 is true and null > 0 is false. On the other hand, the equality check == for undefined and null is defined such that, without any conversions, they equal each other and don't equal anything else. That's why null == 0 is false.
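The split between relational operators (which coerce null to 0) and == (which special-cases null) can be seen side by side:

```javascript
// Relational operators coerce: Number(null) is 0
console.log(null > 0);  // false (0 > 0)
console.log(null >= 0); // true  (0 >= 0)

// == does not coerce null to a number at all
console.log(null == 0); // false
```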


2 Answers

The "Abstract Equality Comparison Algorithm" has many parts, but the important one here is this:

If Type(x) is either String or Number and Type(y) is Object, return the result of the comparison x == ToPrimitive(y).

(There's a mirror-image of that too.) So, because "" is a string and [null] is an object, we've got to first convert [null] to a string by calling ToPrimitive([null]). That's an internal operation described as follows, when it's asked to convert an Object instance to a primitive value:

Return a default value for the Object. The default value of an object is retrieved by calling the [[DefaultValue]] internal method of the object, passing the optional hint PreferredType. The behaviour of the [[DefaultValue]] internal method is defined by this specification for all native ECMAScript objects in 8.12.8.

Now, the [[DefaultValue]] internal operation (with the default hint) first tries .valueOf(), which for an array just returns the array itself, and then falls back to .toString() and returns that value. Try [null].toString() in your browser console:

> [null].toString()
""

And there you have it.

Edit: And why is [null].toString() an empty string? Because the .toString() operation on Array instances always just calls .join(), and that always yields an empty string for null and undefined values. Thus an array of one null ends up as just a single empty string.
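The join-based behavior described above can be checked directly:

```javascript
// Array.prototype.toString delegates to join(),
// and join() renders null and undefined as empty strings
console.log([null].toString());        // ""
console.log([null, undefined].join()); // "," (two empty strings, one separator)
console.log([1, null, 2].toString());  // "1,,2"
```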

Pointy answered Sep 28 '22


It's according to the arcane type-conversion rules of JavaScript. Rule #8:

If Type(x) is either String or Number and Type(y) is Object, return the result of the comparison x == ToPrimitive(y).

So the comparison is between x = "" and y = [null] converted to a primitive using ToPrimitive. Converting an array with one null element yields an empty string (because Array.prototype.toString() returns a comma-separated list of the elements, with null and undefined rendered as empty strings), hence the two compare equal.
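The same ToPrimitive mechanism also explains the related examples linked in the question:

```javascript
console.log("" == [null]); // true: [null] → "" via ToPrimitive, then "" == ""
console.log(2 == [2]);     // true: [2] → "2", then "2" → 2
console.log(0 == "");      // true: "" → 0
console.log([] == false);  // true: [] → "" → 0, and false → 0
```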

McGarnagle answered Sep 28 '22