I am looking at JavaScript's number type system.
I'm using Chrome. When I evaluate 15-- on a number literal, I get a ReferenceError, which makes sense, since you can't decrement a constant. When I evaluate var x = 10; x--; everything works as expected.
As expected, var a = Infinity; a-- evaluates to Infinity. This all makes sense and is in accordance with the JavaScript language spec.
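To illustrate, here is the behavior I'm seeing in the console:

```javascript
// Post-decrement returns the old value, then writes old value - 1 back:
var x = 10;
console.log(x--); // 10
console.log(x);   // 9

// Infinity - 1 is still Infinity in floating-point arithmetic:
var a = Infinity;
a--;
console.log(a);   // Infinity
```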
However, to my surprise, Infinity-- and Infinity++ both evaluate to Infinity, unlike other literals. The same happens for Number.POSITIVE_INFINITY, which holds the same value.
tl;dr: Why does Infinity-- yield Infinity when 15-- and (new Number(15))-- yield a ReferenceError?
Infinity as used in your example is not a literal but refers to the Infinity property of the global object:
15.1 The Global Object
[...]
15.1.1 Value Properties of the Global Object
[...]
15.1.1.2 Infinity
The value of Infinity is +∞ (see 8.5). This property has the attributes { [[Writable]]: false, [[Enumerable]]: false, [[Configurable]]: false }.
So, Infinity-- is the same as window.Infinity--, which is perfectly valid syntax: it decrements a resolvable property reference rather than a bare value.
There is no number literal for infinity; it is a concept from IEEE-754 floating-point arithmetic. JavaScript exposes it as a value property of the global object (much like undefined), so the value itself is an ordinary number, and it behaves consistently in arithmetic and with the Math methods.
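For example, since Infinity is just the IEEE-754 value +∞ of type number, arithmetic and Math methods treat it like any other number:

```javascript
console.log(typeof Infinity);        // "number"
console.log(Infinity - 1);           // Infinity
console.log(1 / Infinity);           // 0
console.log(Infinity - Infinity);    // NaN
console.log(Math.max(42, Infinity)); // Infinity
console.log(Number.POSITIVE_INFINITY === Infinity); // true
```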