In Chrome and Firefox,
typeof foo
evaluates to 'undefined'.
But
typeof (function() { return foo; })()
throws an error:
ReferenceError: foo is not defined
This destroys the notion I have of the substitutability of expressions! Until now, I knew of no conditions under which foo and (function() { return foo; })() are not the same.
Is this standard behavior? If so, it would be helpful to quote the relevant part of the ECMAScript standard.
EDIT:
Another example:
typeof (foo)
typeof (foo + 0)
I would have expected both (foo) and (foo + 0) to throw an error, but the first one throws no error while the second one does.
typeof is perfectly fine to use, but not for general type checking. That's not its purpose. It can only distinguish between "object", "function", "undefined" and the primitives "boolean", "number" and "string".
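For example, a quick sketch of the handful of strings it can return (assuming a plain ES5 environment; the values below are just illustrative):
typeof {}             // "object"
typeof []             // "object" (arrays are objects too)
typeof null           // "object" (a well-known quirk)
typeof function () {} // "function"
typeof undefined      // "undefined"
typeof true           // "boolean"
typeof 42             // "number"
typeof "hi"           // "string"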
The typeof operator returns a string indicating the type of the operand's value.
Basically, the typeof operator checks whether a variable¹ is unresolvable and, if so, returns "undefined". That is, typeof returns a defined value for undeclared variables¹ before ever reaching the GetValue algorithm, which throws for undeclared variables¹.
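A minimal sketch of that difference, assuming someUndeclaredVariable (a placeholder name) has never been declared anywhere:
typeof someUndeclaredVariable === "undefined" // true -- no error is thrown
someUndeclaredVariable === undefined          // ReferenceError: someUndeclaredVariable is not defined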
Quoting ECMAScript 5.1 § 11.4.3 The typeof Operator (emphasis added):
11.4.3 The typeof Operator
The production UnaryExpression : typeof UnaryExpression is evaluated as follows:
1. Let val be the result of evaluating UnaryExpression.
2. If Type(val) is Reference, then
   a. If IsUnresolvableReference(val) is true, return "undefined".
   b. Let val be GetValue(val).
3. Return a String determined by Type(val) according to Table 20.
On the other hand, the return statement -- like most operators and statements which read the value of identifier(s) -- will always call GetValue, which throws on unresolvable identifiers (undeclared variables). Quoting ECMAScript 5.1 § 8.7.1 GetValue (V) (emphasis added):
8.7.1 GetValue (V)
1. If Type(V) is not Reference, return V.
2. Let base be the result of calling GetBase(V).
3. If IsUnresolvableReference(V), throw a ReferenceError exception.
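To illustrate (a sketch, with foo assumed undeclared as in the question): any construct that needs foo's value goes through GetValue and therefore throws, while a bare typeof foo never gets that far.
var x = foo;  // ReferenceError: foo is not defined
foo + 1;      // ReferenceError: foo is not defined
if (foo) {}   // ReferenceError: foo is not defined
typeof foo;   // "undefined" -- short-circuits before GetValue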
Now, analyzing the code:
typeof (function() { return foo; })()
This code will instantiate a function object, execute it, and only then will typeof operate on the function's return value (the function call takes precedence over the typeof operator).
Hence, the code throws while evaluating the IIFE's return statement, before the typeof operation can be evaluated.
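One possible workaround, sketched here under the same assumption that foo is undeclared, is to move typeof inside the function, so that GetValue is never reached on the unresolvable reference:
(function () { return typeof foo; })() // "undefined", no error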
A similar but simpler example:
typeof (foo+1)
The addition is evaluated before typeof. This will throw an error when the Addition Operator calls GetValue on foo, before typeof comes into play.
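For comparison (still assuming foo is undeclared), dropping the parentheses changes the order of evaluation, because typeof binds more tightly than +:
typeof foo + 1   // "undefined1" -- typeof runs first, then string concatenation
typeof (foo + 1) // ReferenceError -- the addition runs first and calls GetValue on foo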
Now:
typeof (foo)
does not throw an error, because the grouping operator (parentheses) does not "evaluate" anything per se; it just forces precedence. More specifically, the grouping operator does not call GetValue: in the example above, it returns an (unresolvable) Reference.
The annotated ES5.1 spec even adds a note about this:
NOTE This algorithm does not apply GetValue to the result of evaluating Expression. The principal motivation for this is so that operators such as delete and typeof may be applied to parenthesised expressions.
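A small sketch of that consequence (foo again assumed undeclared): since the grouping operator only forwards the Reference, nesting parentheses changes nothing, and delete is similarly tolerant in non-strict code:
typeof ((((foo)))) // "undefined" -- each pair of parentheses just passes the Reference along
delete (foo)       // true in non-strict code -- delete also accepts an unresolvable Reference
                   // (in strict mode this particular delete is a SyntaxError)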
N.B. I wrote this answer with a focus on providing a simple and understandable explanation, keeping the technical jargon to a minimum while still being sufficiently clear and providing the requested ECMAScript standard references, which I hope will be a helpful resource for developers who struggle to understand the typeof operator.
¹ The term "variable" is used for ease of understanding. A more correct term would be identifier, which can be introduced into a Lexical Environment not only through variable declarations, but also through function declarations, formal parameters, calling a function (arguments), with/catch blocks, assigning a property to the global object, let and const declarations (ES6), and possibly a few other ways.
Is this standard behavior?
Yes. typeof doesn't throw an error because it just returns a value as specified. However, as other answers have said, the code fails when evaluating the operand.
If so, it would be helpful to quote the relevant part of the ECMAScript standard.
When evaluating the function expression, an attempt to resolve the value of foo (so that it can be returned) will call the internal GetValue method with argument foo. However, since foo hasn't been declared or otherwise created, a reference error is thrown.
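A small sketch of where the throw actually originates (foo undeclared, as in the question): the error comes from running the function body, before typeof ever receives a value.
try {
  typeof (function () { return foo; })();
} catch (e) {
  console.log(e instanceof ReferenceError); // true
}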
In the case of:
typeof (foo)
"(" and ")" are punctuators, denoting a grouping, such as a (possibly empty) parameter list when calling a function like foo(a, b)
, or an expression to be evaluated, e.g. if (x < 0)
and so on.
In the case of typeof (foo)
they simply denote evaluating foo before applying the typeof operator. So foo, being a valid identifier, is passed to typeof, per link above, which attempts to resolve it, can't, determines it's an unresolveable reference, and returns the string "undefined"
.
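In other words (a sketch, with foo undeclared), the parentheses make no observable difference here:
typeof (foo) === typeof foo // true -- both evaluate to "undefined"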
In the case of:
typeof (foo + 0)
the brackets cause the expression foo + 0 to be evaluated first. When getting the value of foo, a reference error is thrown, so typeof never gets to operate. Note that without the brackets:
typeof foo + 0 // "undefined0"
because of operator precedence: typeof foo returns the string "undefined", and since one of the operands of + is a string, the addition operator performs concatenation (the string version of addition, not the mathematical one), so 0 is converted to the string "0" and concatenated to "undefined", resulting in the string "undefined0".
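The same precedence can be made explicit with grouping (a sketch, foo still undeclared):
(typeof foo) + 0 // "undefined0" -- equivalent to the un-bracketed form above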
So any time the evaluation of an expression containing an unresolvable reference (e.g. an undeclared variable) is attempted, a reference error will be thrown. For example,
typeof !foo
throws a reference error too, because in order to work out what to pass to typeof, the expression must be evaluated. To apply the ! operator, the value of foo must be obtained, and in attempting that, a reference error is thrown.
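By contrast, the same shape of expression with a resolvable operand is harmless (a sketch, foo undeclared):
typeof !foo  // ReferenceError -- ! must read foo's value before typeof runs
typeof !true // "boolean" -- with a resolvable operand there is nothing to throw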