When I try to Date.parse() an integer or string 0, it returns 946681200000, which translates to a date of:
Sat Jan 01 2000 00:00:00 GMT+0100 (CET)
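For reference, here is a minimal reproduction; the exact output below is what I see in Chrome (V8), and parsing of non-ISO strings is engine-specific:

// A number argument is coerced to a string, so Date.parse(0) behaves like Date.parse('0').
Date.parse('0');                       // 946681200000 in a GMT+0100 timezone
new Date(Date.parse('0')).toString(); // "Sat Jan 01 2000 00:00:00 GMT+0100 (CET)"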
I would assume that the parser interprets the single zero as the year 2000, but the specs say nothing about a single-character year: both RFC 2822 and ISO 8601 require a four-character year in the string.
I would like to better understand how the string '0' is parsed into a Date: why is it accepted as a valid Date (should it not be NaN or some such?), and why is the year 2000 chosen instead of, for example, 1900?
Update
After some trial & error, I discovered that the single number is in fact interpreted differently in different numeric ranges.
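A quick probe script makes this visible; the range boundaries are engine heuristics (V8 in my case), so the exact results will differ between engines and may change between versions:

// Illustrative only: logs how each single-number string is parsed by the current engine.
for (const s of ['0', '1', '12', '13', '31', '32', '49', '50', '99', '100', '1000']) {
  const t = Date.parse(s);
  console.log(s, '->', Number.isNaN(t) ? 'NaN' : new Date(t).toString());
}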
MDN's documentation for Date.parse() says:

The Date.parse() method parses a string representation of a date, and returns the number of milliseconds since January 1, 1970, 00:00:00 UTC, or NaN if the string is unrecognized or, in some cases, contains illegal date values (e.g. 2015-02-31).
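For example, an ISO 8601 string parses consistently, while the illegal date MDN mentions yields NaN in most engines:

Date.parse('2000-01-01T00:00:00Z'); // 946684800000, milliseconds since the Unix epoch
Date.parse('2015-02-31');           // NaN in most engines: February has no 31st day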
the specs say nothing about a single-character year
The spec says:
If the String does not conform to that format the function may fall back to any implementation-specific heuristics or implementation-specific date formats.
For V8 specifically, see this bug report on unpredictable results when called with a single number. You can also read the source directly (dateparser.cc, dateparser.h, dateparser-inl.h).
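Given that fallback clause, one way to sidestep the heuristics entirely is to validate the input yourself before calling Date.parse(). A minimal sketch; the helper name and the date-only regex are my own, not part of any API:

// Hypothetical helper: accept only the YYYY-MM-DD form the spec defines,
// so implementation-specific heuristics are never reached.
function parseIsoDateStrict(s) {
  if (!/^\d{4}-\d{2}-\d{2}$/.test(s)) return NaN;
  return Date.parse(s);
}

parseIsoDateStrict('2000-01-01'); // 946684800000 (date-only forms parse as UTC midnight)
parseIsoDateStrict('0');          // NaN, never reaches the engine heuristics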