Below is an example comparing two JavaScript objects, but I am confused by the returned values.
var i = new Object();
var j = new Object();

i == j   false
i != j   true
i >= j   true
i <= j   true
i > j    false
i < j    false
How are the values determined for the above? I am having trouble understanding.
Here are the reasons:
i == j   false // They refer to two different objects
i != j   true  // They refer to two different objects
i >= j   true  // Relational operators convert both operands to primitives first,
               // so this becomes ToPrimitive(i) >= ToPrimitive(j), which is
               // evaluated as "[object Object]" >= "[object Object]".
               // The two strings are equal, which is why the result is true.
i <= j   true  // Similar to the >= case: the strings are equal
i > j    false // The strings are equal, so neither is strictly greater
i < j    false // The strings are equal, so neither is strictly less
i < -j   false // Similar to the >= case, except that the unary minus first
               // converts j to a number: ToNumber("[object Object]") is NaN,
               // so -j is NaN. Any comparison involving NaN returns false,
               // as per the abstract relational comparison algorithm.
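You can check the conversion step yourself. The snippet below is a minimal sketch that makes the implicit ToPrimitive step explicit by comparing the string forms directly:

var i = new Object();
var j = new Object();

// == and != compare references: two distinct objects are never equal.
console.log(i == j);                  // false
console.log(i != j);                  // true

// Relational operators convert both operands to primitives first;
// for a plain object that falls back to toString().
console.log(String(i));               // "[object Object]"
console.log(String(i) >= String(j));  // true  - equal strings
console.log(String(i) <= String(j));  // true  - equal strings
console.log(String(i) > String(j));   // false - neither string is greater
console.log(String(i) < String(j));   // false - neither string is less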
You mentioned that i < -j would evaluate to true, but that is wrong: it evaluates to false, for the reasons given above.
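To verify the i < -j case, the following sketch traces the ToNumber conversion that the unary minus triggers:

var i = new Object();
var j = new Object();

console.log(String(j));   // "[object Object]"
console.log(Number(j));   // NaN - "[object Object]" cannot be parsed as a number
console.log(-j);          // NaN - unary minus applies the same ToNumber conversion
console.log(i < -j);      // false - any comparison involving NaN is false
console.log(NaN < NaN);   // false
console.log(NaN >= NaN);  // false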