The test:
var d1 = new Date("2000-04-22T00:00:00+00:00");
var d2 = new Date(2000, 4, 22, 0, 0, 0, 0);
console.log("d1 = " + d1.getTime());
console.log("d2 = " + d2.getTime());
The result:
d1 = 956361600000
d2 = 958946400000
I think the difference comes from timezones, but I don't understand it at all!
What magic produces this difference?
Your ISO date string has an explicit UTC offset (+00:00), so it is interpreted as UTC. The second date is constructed from numeric date parts, and that form of the Date constructor implicitly uses the local time zone.
Also note that the "4" in the date string means "April", while the "4" in the second constructor means "May": when using the numeric constructor, months are numbered starting from zero, not one.
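As a quick check (a minimal sketch; toISOString always renders the instant in UTC), printing both dates in ISO form makes both discrepancies visible:
var d1 = new Date("2000-04-22T00:00:00+00:00");
var d2 = new Date(2000, 4, 22, 0, 0, 0, 0);
console.log(d1.toISOString()); // "2000-04-22T00:00:00.000Z"
console.log(d2.toISOString()); // May 22, local midnight, rendered in UTC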
The bigger problem is that months are 0-indexed in JavaScript, so in your second Date instance the 4 means May. Also, your string includes +00:00 as the timezone offset, which is UTC/GMT; your second date provides no timezone information, so it is interpreted in the local timezone.
If you want a UTC date constructed similarly to your string, you can call:
var d2 = new Date(Date.UTC(2000, 3, 22, 0, 0, 0, 0));
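With the month corrected to 3 (April) and Date.UTC forcing a UTC interpretation, the two timestamps should now agree:
var d1 = new Date("2000-04-22T00:00:00+00:00");
var d2 = new Date(Date.UTC(2000, 3, 22, 0, 0, 0, 0));
console.log(d1.getTime() === d2.getTime()); // true, both are 956361600000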
I think this is best described in the documentation:
Note: Where Date is called as a constructor with more than one argument, the specified arguments represent local time. If UTC is desired, use new Date(Date.UTC(...)) with the same arguments.
In other words, when you parse a string with an explicit offset, you get that offset (here UTC). When you build a date from numeric parts, they are interpreted in your local time zone.
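To illustrate (the exact number depends on your machine's time zone), the same numeric arguments produce different instants under the two interpretations:
var local = new Date(2000, 3, 22, 0, 0, 0, 0);          // April 22, local midnight
var utc = new Date(Date.UTC(2000, 3, 22, 0, 0, 0, 0));  // April 22, UTC midnight
console.log((utc.getTime() - local.getTime()) / 60000);  // local offset from UTC in minutes, e.g. 120 for UTC+2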