I am converting a DateTime to Unix time. As I understand it, these two approaches should return the same result.
Option 1
DateTime dtfoo = new DateTime(2010, 10, 20);
DateTimeOffset dtfoo2 = new DateTimeOffset(dtfoo).ToUniversalTime();
long afoo = dtfoo2.ToUnixTimeMilliseconds();
Option 2
DateTime dtfoo = new DateTime(2010, 10, 20);
DateTimeOffset dtfoo2 = new DateTimeOffset(dtfoo).ToUniversalTime();
long afoo = (Int64)(dtfoo2.Subtract(new DateTime(1970, 1, 1))).TotalMilliseconds;
Option 1 returns 1287525600000 and Option 2 returns 1287529200000.
Why am I getting different results?
In computing, Unix time (also known as Epoch time, Posix time, seconds since the Epoch, Unix timestamp or UNIX Epoch time) is a system for describing a point in time. It is the number of seconds that have elapsed since the Unix epoch, excluding leap seconds. The Unix epoch is 00:00:00 UTC on 1 January 1970.
Unix is an operating system originally developed in the 1960s. Unix time is a way of representing a timestamp by representing the time as the number of seconds since January 1st, 1970 at 00:00:00 UTC.
A few things you should know about Unix timestamps:
- Unix timestamps are always based on UTC (otherwise known as GMT). It is illogical to think of a Unix timestamp as being in any particular time zone.
- Unix timestamps do not account for leap seconds.
Unix systems traditionally represent points in time as signed 32-bit integers holding the Unix timestamp. Because it uses 32 bits, Unix time can only cover approximately 136 years in total.
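For example, in C# a Unix timestamp can be read straight off a DateTimeOffset; the following is a minimal sketch of the definition above (ToUnixTimeSeconds is available from .NET Framework 4.6 / .NET Core onwards):
// Unix time is the elapsed time since the epoch, 1970-01-01T00:00:00Z.
DateTimeOffset epoch = new DateTimeOffset(1970, 1, 1, 0, 0, 0, TimeSpan.Zero);
DateTimeOffset midnightUtc = new DateTimeOffset(2010, 10, 20, 0, 0, 0, TimeSpan.Zero);
Console.WriteLine(midnightUtc.ToUnixTimeSeconds());                       // 1287532800
Console.WriteLine((midnightUtc - epoch).Ticks / TimeSpan.TicksPerSecond); // 1287532800 - the manual equivalent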
Note: I am in UTC+9, and the root of your issue is timezone offsets, so the Unix times I see may differ slightly from your own.
The difference lies in how you're handling your date objects. I'm assuming from the resultant difference that your timezone is CET (or you were using rextester, which I believe is in Germany).
Consider the following code:
var dtfoo = new DateTime(2010, 10, 20);
var dtfoo2 = new DateTimeOffset(dtfoo);
var dtfoo3 = dtfoo2.ToUniversalTime();
This code does the following:
1. Creates a DateTime with a DateTimeKind of "Unspecified".
2. Creates a DateTimeOffset object from this. Because the DateTimeKind is Unspecified, the system time offset from UTC is used.
Quoting the documentation for #2:
If the value of DateTime.Kind is DateTimeKind.Local or DateTimeKind.Unspecified, the DateTime property of the new instance is set equal to dateTime, and the Offset property is set equal to the offset of the local system's current time zone.
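To see that behaviour in isolation, here is a small sketch of my own (not from the question) showing which offset the DateTimeOffset(DateTime) constructor picks for each DateTimeKind:
// How DateTimeOffset(DateTime) chooses its offset based on the DateTime's Kind.
var unspecified = new DateTime(2010, 10, 20);                     // Kind = Unspecified
var utc = new DateTime(2010, 10, 20, 0, 0, 0, DateTimeKind.Utc);
var local = new DateTime(2010, 10, 20, 0, 0, 0, DateTimeKind.Local);
Console.WriteLine(new DateTimeOffset(unspecified).Offset); // local machine's offset (09:00:00 for me)
Console.WriteLine(new DateTimeOffset(utc).Offset);         // 00:00:00 - no guessing involved
Console.WriteLine(new DateTimeOffset(local).Offset);       // local machine's offset again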
Now let's write out the roundtrip format date strings for 1-3:
2010-10-20T00:00:00.0000000
2010-10-20T00:00:00.0000000+09:00
2010-10-19T15:00:00.0000000+00:00
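For reference, these are just the three variables from the snippet above printed with the round-trip "o" format specifier:
Console.WriteLine(dtfoo.ToString("o"));  // 1: the plain DateTime - no offset information
Console.WriteLine(dtfoo2.ToString("o")); // 2: the DateTimeOffset, stamped with my +09:00 offset
Console.WriteLine(dtfoo3.ToString("o")); // 3: the same instant converted to UTC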
I'm in UTC+9, so the DateTimeOffset was rightly created with an offset of +9h. Converting that to universal time takes us to 3pm on the 19th. Unfortunately, this causes the output of .ToUnixTimeMilliseconds() to be 1287500400000, which is 2010-10-19T15:00:00Z. The value has become dependent on the machine's timezone.
So, now let's take a look at your second example:
DateTime dtfoo = new DateTime(2010, 10, 20);
DateTimeOffset dtfoo2 = new DateTimeOffset(dtfoo).ToUniversalTime();
long afoo = (Int64)(dtfoo2.Subtract(new DateTime(1970, 1, 1))).TotalMilliseconds;
OK, let's split this into the different parts so that we can see what time the system thinks they represent (remember that I'm in UTC+9):
new DateTime(2010, 10, 20).ToString("o") - 2010-10-20T00:00:00.0000000
new DateTimeOffset(dtfoo).ToString("o") - 2010-10-20T00:00:00.0000000+09:00
new DateTimeOffset(dtfoo).ToUniversalTime() - 2010-10-19T15:00:00.0000000+00:00
new DateTime(1970, 1, 1).ToString("o") - 1970-01-01T00:00:00.0000000
So you're effectively performing this calculation:
(DateTimeOffset.Parse("2010-10-19T15:00:00.0000000+00:00") - DateTime.Parse("1970-01-01T00:00:00.0000000")).TotalMilliseconds
This outputs 1287532800000, which equates to 2010-10-20T00:00:00Z. This gives you the correct result because of how the subtraction is done:
1. The DateTime is implicitly cast to a DateTimeOffset, equivalent to new DateTimeOffset(DateTime.Parse("1970-01-01T00:00:00.000000")) - this means that both input dates have been through the same timezone changes.
2. The subtraction is then carried out on the DateTime objects by calling the DateTimeOffset's .UtcDateTime property.
So how can we fix your original example? We can take the local timezone offset out of the equation by specifying the offset when constructing the DateTimeOffset:
:
DateTime dtfoo = new DateTime(2010, 10, 20);
DateTimeOffset dtfoo2 = new DateTimeOffset(dtfoo, TimeSpan.Zero).ToUniversalTime();
long afoo = dtfoo2.ToUnixTimeMilliseconds();
This now gives us the same value as we got in the previous test: 1287532800000. If we simplify this by using DateTimeOffset.Parse we should confirm that we're on the right track:
Console.WriteLine(DateTimeOffset.Parse("2010-10-20T00:00:00Z").ToUnixTimeMilliseconds());
And we can see that this also outputs 1287532800000.
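An equivalent fix, a variation of my own rather than anything from the question, is to stamp the DateTime itself as UTC with DateTime.SpecifyKind, since the DateTimeOffset constructor uses a zero offset for a DateTimeKind of Utc:
// Alternative fix: mark the DateTime as UTC so no local offset is ever applied.
DateTime dtUtc = DateTime.SpecifyKind(new DateTime(2010, 10, 20), DateTimeKind.Utc);
long unixMs = new DateTimeOffset(dtUtc).ToUnixTimeMilliseconds();
Console.WriteLine(unixMs); // 1287532800000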
So, to conclude, your issue stems from how the DateTimeOffset(DateTime) constructor handles dates with a DateTimeKind of Unspecified or Local. It skews your resulting universal time depending on your machine's timezone, and this leads to an incorrect Unix time offset. To solve it, simply create your DateTimeOffset in one of the ways I have described above.