Could somebody explain to me the difference between if(obj.x == undefined)
and if(typeof obj.x == 'undefined')?
In some contexts the first one works fine, but in others I need to use the second way.
Questions
1 - What is the difference between the two conditions?
2 - Is there a best practice?
The best practice is to not just check truthiness but to check strict equality against undefined.
Example:
if (obj.x === undefined) {}
This used to be an issue because undefined (a global property) used to be writable; as of JavaScript 1.8.5 it is non-writable, providing you with a secure comparison in ES5-spec environments.
Per MDN.
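As a minimal sketch of why the strict check matters (the object here is just for illustration): with loose equality, null also passes the test, while the strict check only matches undefined.
var obj = { x: null };             // illustrative object
console.log(obj.x == undefined);   // true  -- loose equality treats null as equal to undefined
console.log(obj.x === undefined);  // false -- strict equality distinguishes null from undefined
console.log(obj.y === undefined);  // true  -- a missing property really is undefined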
The two would usually be equivalent if you replaced the equality operator == with the strict equality operator ===. So obj.x === undefined and typeof obj.x == "undefined" are usually equivalent.
However, in pre-ECMAScript 5 environments (which still account for the majority of web requests, in general), undefined is a writable property of the global object, meaning that undefined may be used as a variable name or the global property may be assigned a different value. ECMAScript 5 makes the global property read-only, but even then, undefined may still be used as a variable name within a function, meaning that the typeof check is always safer.
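To make that concrete, here is a small sketch (names are purely illustrative, non-strict code assumed) of how a locally shadowed undefined breaks the direct comparison while the typeof check keeps working:
function demo() {
  var undefined = 42;                     // a local variable shadows the real undefined
  var x;                                  // x is genuinely undefined
  console.log(x === undefined);           // false -- compares against the shadowing value 42
  console.log(typeof x === "undefined");  // true  -- typeof is unaffected by the shadowing
}
demo();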
One further point in favour of typeof is that it may be used to check for a variable that may not have been declared, whereas a direct comparison will throw a ReferenceError if the variable has not been declared. For example:
typeof foo == "undefined" // true
foo === undefined // ReferenceError
However, this is an unusual and not generally helpful thing to be doing.
The two are not equivalent tests because of JavaScript's quite convoluted handling of special values. Specifically, undefined == null is true, but typeof undefined is "undefined" while typeof null is "object".
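A quick console sketch illustrates the point:
console.log(undefined == null);    // true  -- loose equality considers them equal
console.log(undefined === null);   // false -- strict equality does not
console.log(typeof undefined);     // "undefined"
console.log(typeof null);          // "object" -- a long-standing quirk of the language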
The rules for those special values are quite complex and IMO illogical, so I think there's no "general rule". What you may find are common forms, for example
var value = obj.x || default_value;
which can be used if you're sure that obj will never be undefined or null (because in that case an exception would be thrown) and assuming that 0, NaN, or an empty string should be considered as if no value was provided (because they're all "logically false" values). An empty array or an empty JavaScript object, on the other hand, is considered "logically true".
Why is it that way? Why does (null).x throw an exception when null, according to typeof, is apparently an object, and searching for a non-existent field in an object normally returns undefined instead?
I've no idea.
I never tried to find a logic in all those strange rules. I'm not actually even 100% sure there's one.
My suggestion is just to study and experiment with them.