I'm confused. In JavaScript:
> new Date('2012-1-15') - new Date('2012-01-15')
21600000
Why is that? (21600000 / 1000 / 3600 == 6 hours)
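One way to see where the gap comes from is to print both dates in UTC with toISOString. The output below is only a sketch assuming a machine whose local timezone is UTC-6 (e.g. US Central Standard Time), which is what a 6-hour difference implies; other zones will show a different second result:

> new Date('2012-01-15').toISOString()
"2012-01-15T00:00:00.000Z"
> new Date('2012-1-15').toISOString()
"2012-01-15T06:00:00.000Z"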
The date format yyyy-mm-dd (2012-01-15) is parsed as being a UTC date, while yyyy-m-dd (2012-1-15) is parsed as a local date. This is shown if you use .toString on each.
> (new Date('2012-01-15')).toString()
"Sat Jan 14 2012 16:00:00 GMT-0800 (Pacific Standard Time)"
> (new Date('2012-1-15')).toString()
"Sun Jan 15 2012 00:00:00 GMT-0800 (Pacific Standard Time)"
Note that I am in California, hence the Pacific Standard Time. If you are in a different time zone you will get different results.
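If you want to see the offset your own environment applies, Date#getTimezoneOffset reports the local offset from UTC in minutes for a given date. The 360 below is only illustrative, assuming a UTC-6 zone such as US Central Standard Time; in Pacific Standard Time you would get 480:

> new Date('2012-1-15').getTimezoneOffset()
360

360 minutes is 6 hours, which matches the difference in the question.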
When JavaScript parses a date string it tries the standardized ISO format (which it treats as UTC) before it falls back to localized, implementation-specific formats. The last part of the ISO format is a timezone offset from GMT, which is assumed to be 0 when it is missing (as it is in this example). To get the same date you would need the full timestamp with an explicit offset: 2012-01-15T00:00:00-08:00.
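As a quick sketch, supplying the offset explicitly makes the two forms agree; the 0 results below assume your local offset on that date really is -08:00 (Pacific Standard Time), as in the output above. The numeric Date constructor is another option, since it always builds a local date and avoids string parsing entirely (months are zero-based, so 0 is January):

> new Date('2012-01-15T00:00:00-08:00') - new Date('2012-1-15')
0
> new Date(2012, 0, 15) - new Date('2012-1-15')
0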