This is based on "Computing milliseconds since 1970 in C# yields different date than JavaScript" and "C# version of Javascript Date.getTime()".
For all of these calculations, assume they are being done in Central Standard Time, so 6 hours behind UTC (this offset will come up again later).
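(If you want to reproduce these numbers, you can confirm your machine's offset first. This is just a sanity check, assuming the system time zone is set to US Central:)
// Prints the local UTC offset for the date used throughout this post.
// On a machine set to US Central Time this prints -06:00:00 (February, so no DST).
Console.WriteLine(TimeZoneInfo.Local.GetUtcOffset(new DateTime(2014, 2, 28)));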
I understand that JavaScript Date objects are based on the Unix Epoch (midnight on Jan 1, 1970). So, if I do:
//remember that JS months are 0-indexed, so February == 1
var d = new Date(2014,1,28);
d.getTime();
My output will be:
1393567200000
Which represents the number of milliseconds since the Unix Epoch. That's all well and good. In the linked questions, people were asking about translating this functionality into C# and the "naive" implementation usually looks something like this:
//the date of interest in UTC
DateTime e = new DateTime(2014, 2, 28, 0, 0, 0, DateTimeKind.Utc);
//the Unix Epoch
DateTime s = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
//the difference between the two
TimeSpan t = (e - s);
var x = t.TotalMilliseconds;
Console.WriteLine(x);
Which produces output:
1393545600000
That's a difference of 21,600,000 milliseconds, or 6 hours: the exact offset from UTC for the time zone in which these calculations were done.
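(To make the arithmetic concrete, converting both outputs back to UTC instants shows where the 6 hours come from. A quick check, assuming .NET 4.6+ for DateTimeOffset.FromUnixTimeMilliseconds:)
// The JavaScript value is 06:00 UTC, i.e. local CST midnight; the "naive" value is 00:00 UTC.
Console.WriteLine(DateTimeOffset.FromUnixTimeMilliseconds(1393567200000).ToString("u")); // 2014-02-28 06:00:00Z
Console.WriteLine(DateTimeOffset.FromUnixTimeMilliseconds(1393545600000).ToString("u")); // 2014-02-28 00:00:00Z
// The gap between them is exactly the local UTC offset.
Console.WriteLine(TimeSpan.FromMilliseconds(1393567200000 - 1393545600000).TotalHours); // 6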
To get the C# implementation to match the JavaScript output, this is the implementation that works:
//DateTimeKind.Unspecified
DateTime st = new DateTime(1970, 1, 1);
//DateTimeKind.Unspecified
DateTime e = new DateTime(2014, 2, 28);
//translate e to UTC, but leave st as is
TimeSpan t = (e.ToUniversalTime() - st);
var x = t.TotalMilliseconds;
Console.WriteLine(x);
Which will give me output matching the JavaScript output:
1393567200000
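(As an aside, not part of the original question: on .NET 4.6 or newer, DateTimeOffset can express the same idea more directly, because a DateTimeOffset built from an Unspecified DateTime picks up the local offset. A minimal sketch, assuming the code runs on a CST machine:)
// The DateTimeOffset constructor applies the local offset to an Unspecified DateTime,
// so this matches new Date(2014,1,28).getTime() run in the same time zone.
var local = new DateTimeOffset(new DateTime(2014, 2, 28));
Console.WriteLine(local.ToUnixTimeMilliseconds()); // 1393567200000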
What I have yet to find is an explanation for why we have to leave the DateTime representing the Unix Epoch with a DateTimeKind of Unspecified in order to match JavaScript. Shouldn't we get the correct result using DateTimeKind.Utc? What detail am I not understanding? This is a purely academic question for me; I'm just curious about why it works this way.
As you correctly point out, .getTime() returns "the number of milliseconds since 1 January 1970 00:00:00 UTC," which means that .getTime() (as you noticed) includes the offset from UTC in the calculation.
In order to make the C# code reflect this, the time you're subtracting from must include time zone information, while 1 January 1970 00:00:00 must be a UTC time.
This might be easier to understand with a few examples. Given:
DateTime e = new DateTime(2014, 2, 28, 0, 0, 0);
DateTime s = new DateTime(1970, 1, 1, 0, 0, 0);
- e - s is incorrect because s is not a UTC time.
- e.ToUniversalTime() - s.ToUniversalTime() is incorrect because e no longer includes the offset from UTC (as the JavaScript calculation does).
- e.ToUniversalTime() - s is correct because the epoch is treated as UTC, while converting e to UTC folds the local offset into the difference, just as JavaScript does.
This was easier for me to see when I dealt with DateTime.Ticks directly:
e.Ticks // 635291424000000000
s.Ticks // 621355968000000000
e.Ticks - s.Ticks // 13935456000000000 ("naive" implementation)
e.ToUniversalTime().Ticks - s.Ticks // 13935672000000000 (correct output for the 6-hour CST offset)
Again, the last example meets all of our requirements. The Unix epoch is in UTC, while the time we're dealing with still has its original offset.
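(If it helps to connect these tick counts to the millisecond values above, there are 10,000 ticks in a millisecond. A quick conversion, using the same CST assumption as the rest of the post:)
// 10,000 ticks per millisecond (TimeSpan.TicksPerMillisecond).
Console.WriteLine(13935456000000000 / TimeSpan.TicksPerMillisecond); // 1393545600000 ("naive" output)
Console.WriteLine(13935672000000000 / TimeSpan.TicksPerMillisecond); // 1393567200000 (matches JavaScript)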
I understand that JavaScript Date objects are based on the Unix Epoch (Midnight on Jan 1, 1970).
Yes, they are. Internally, it's just a number of milliseconds from the epoch. But when you call the date constructor, or look at the output from .toString(), it is using the local time of where the code is running.
If you want the input to be specified in UTC, then you have to use a different incantation:
var ts = Date.UTC(2014,1,28); // returns a numeric timestamp, not a Date object
var dt = new Date(ts); // if you want a date object
var s = dt.toUTCString(); // if you want the output to be in UTC
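(Tying this back to the C# side: Date.UTC(2014,1,28) returns 1393545600000, the same number the "naive" C# implementation in the question produces, because both treat midnight on Feb 28 as a UTC instant. A small check, assuming .NET 4.6+:)
// Midnight Feb 28, 2014 taken as a UTC instant, the same thing Date.UTC(2014,1,28) describes.
var utcMidnight = new DateTimeOffset(2014, 2, 28, 0, 0, 0, TimeSpan.Zero);
Console.WriteLine(utcMidnight.ToUnixTimeMilliseconds()); // 1393545600000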