I am aware that the question "Why does 10..toString() work, but 10.toString() does not?" already exists, but it does not provide a formal explanation.
The answer there says only that "the specification's interpretation of the . character in that particular position is that it will be a decimal point; this is defined by the numeric literal syntax of ECMAScript."
Without a reference to the standard, that is not convincing enough.
I intuitively understand that
42..toString()
is treated by the parser as the number literal 42. followed by a .toString() call.
What I cannot understand is why an interpreter cannot realize that
42.toString()
is the number 42 followed by a method call.
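For concreteness, here is the behavior in question (as observed in common engines such as V8 and SpiderMonkey):

42..toString()   // "42"  -- "42." parses as a number literal, then .toString()
(42).toString()  // "42"  -- the parenthesis ends the numeric literal first
42 .toString()   // "42"  -- the whitespace also ends the literal
42.toString()    // SyntaxError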
Is this just a limitation of modern JS interpreters, or is it explicitly mandated by ES5.1?
In ES5.1, a numeric literal is defined as follows (only the significant part of the definition):

NumericLiteral ::
    DecimalLiteral
    HexIntegerLiteral

DecimalLiteral ::
    DecimalIntegerLiteral . DecimalDigits(opt) ExponentPart(opt)
    . DecimalDigits ExponentPart(opt)
    DecimalIntegerLiteral ExponentPart(opt)
The last rule is the one I would expect the parser to choose here.
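As a concrete illustration of the productions (the rule annotations are mine, not the spec's):

42      // DecimalIntegerLiteral                            -> 42
42.     // DecimalIntegerLiteral . (DecimalDigits omitted)  -> 42
42.5    // DecimalIntegerLiteral . DecimalDigits            -> 42.5
.5      // . DecimalDigits                                  -> 0.5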
UPD: To clarify, this question expects as an answer references to the ES specification that explicitly state that the interpreter must behave the way it does.
I believe the piece you're missing is this quote from section 7:
The source text is scanned from left to right, repeatedly taking the longest possible sequence of characters as the next input element.
Note "longest possible sequence of characters"; since "42." is a valid token (which is a kind of input element), it must be used rather than "42" and then ".".