I came across this piece of code: var timeStamp = 1 * new Date(); and to my surprise it returned the value in milliseconds since 1970/01/01. This is equivalent to using the .getTime() method!
What's happening under the hood? Does type conversion apply here, converting the new Date() value into milliseconds?
What's happening under the hood?
The short version:
Because it's being used in a math operation, the date is converted to a number, and when you convert a Date to a number, the number you get is the milliseconds-since-the-Epoch value (the same value getTime() returns).
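The equivalence is easy to verify with a minimal sketch:

```javascript
const d = new Date();

// Multiplying coerces the Date to its primitive number value,
// which is the same milliseconds-since-the-Epoch that getTime() returns:
const viaMultiply = 1 * d;
const viaGetTime = d.getTime();

console.log(viaMultiply === viaGetTime); // true
```
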
The long version:
1. The multiplication operator calls the abstract operation ToNumber on its operands.
2. For objects such as Dates, ToNumber calls the abstract operation ToPrimitive on the object, with the "preferred type" being "number".
3. For most types of objects (including Dates), ToPrimitive calls the abstract operation [[DefaultValue]], passing along the preferred type as the "hint".
4. [[DefaultValue]] with hint = "number" calls valueOf on the object. (valueOf is a real method, unlike the abstract operations above.)
5. For Date objects, valueOf returns the "time value", the value you get from getTime.
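The same valueOf hook can be demonstrated with a plain object; fakeDate is a made-up name purely for illustration:

```javascript
// A plain object with its own valueOf, using the same hook Date uses
// when it is coerced to a number in a math operation:
const fakeDate = {
  valueOf() {
    return 1234567890; // pretend "time value"
  }
};

console.log(1 * fakeDate); // 1234567890
```
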
Side note: There's no reason I can think of to use var timeStamp = 1 * new Date() rather than, say, var timeStamp = +new Date(), which has the same effect. Or, of course, on any modern engine (and the shim is trivial), var timeStamp = Date.now() (more on Date.now).