Unix Timestamp: Difference between using ToUnixTimeMilliseconds and TimeSpan.TotalMilliseconds

I am converting a DateTime to Unix time. As I understand it, these two approaches should return the same result.

Option 1

DateTime dtfoo = new DateTime(2010, 10, 20);
DateTimeOffset dtfoo2 = new DateTimeOffset(dtfoo).ToUniversalTime();
long afoo = dtfoo2.ToUnixTimeMilliseconds();

Option 2

DateTime dtfoo = new DateTime(2010, 10, 20);
DateTimeOffset dtfoo2 = new DateTimeOffset(dtfoo).ToUniversalTime();
long afoo = (Int64)(dtfoo2.Subtract(new DateTime(1970, 1, 1))).TotalMilliseconds;

Option 1 returns 1287525600000 and Option 2 returns 1287529200000.

Why am I getting different results?

asked Nov 15 '18 at 21:11 by user9923760

People also ask

Are epoch time and Unix timestamps the same?

In computing, Unix time (also known as Epoch time, Posix time, seconds since the Epoch, Unix timestamp or UNIX Epoch time) is a system for describing a point in time. It is the number of seconds that have elapsed since the Unix epoch, excluding leap seconds. The Unix epoch is 00:00:00 UTC on 1 January 1970.

Are Unix timestamps seconds or milliseconds?

Unix is an operating system originally developed in the late 1960s. Unix time represents a timestamp as the number of seconds since January 1st, 1970 at 00:00:00 UTC; millisecond-based variants count milliseconds from the same epoch, as shown below.
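For illustration (my addition, not part of the original answer), .NET's DateTimeOffset exposes both units, and the two values differ only by a factor of 1,000. A minimal sketch:

var dto = DateTimeOffset.Parse("2010-10-20T00:00:00Z");
Console.WriteLine(dto.ToUnixTimeSeconds());      // 1287532800 (seconds since the epoch)
Console.WriteLine(dto.ToUnixTimeMilliseconds()); // 1287532800000 (milliseconds since the epoch)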

Are Unix timestamps always UTC?

A few things you should know about Unix timestamps: Unix timestamps are always based on UTC (otherwise known as GMT). It is illogical to think of a Unix timestamp as being in any particular time zone. Unix timestamps do not account for leap seconds.
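A short sketch (my addition) showing that the same instant, expressed with different offsets, yields the same Unix timestamp:

// 09:00 at UTC+9 and midnight at UTC are the same instant, so the Unix times match:
Console.WriteLine(DateTimeOffset.Parse("2010-10-20T09:00:00+09:00").ToUnixTimeMilliseconds()); // 1287532800000
Console.WriteLine(DateTimeOffset.Parse("2010-10-20T00:00:00Z").ToUnixTimeMilliseconds());      // 1287532800000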

How long are Unix timestamps?

UNIX represents different points in time as signed integers, traditionally of 32 bits, encoding the UNIX timestamp. Because it uses 32 bits, UNIX time can only cover a range of approximately 136 years in total.
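As an illustration of that limit (my addition; this is the well-known "year 2038 problem"), FromUnixTimeSeconds shows where a signed 32-bit counter runs out:

// int.MaxValue seconds after the epoch is the last instant a signed 32-bit Unix time can hold
var max32 = DateTimeOffset.FromUnixTimeSeconds(int.MaxValue);
Console.WriteLine(max32.ToString("u")); // 2038-01-19 03:14:07Z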


1 Answer

Notice: I am in UTC+9, and the root of your issue comes from timezone offsets, so understand that the unix times I see may differ slightly from your own.


The difference lies in how you're handling your date objects. I'm assuming from the resultant difference that your timezone is CET (or you were using rextester, which I believe is in Germany).

Consider the following code:

var dtfoo = new DateTime(2010, 10, 20);
var dtfoo2 = new DateTimeOffset(dtfoo);
var dtfoo3 = dtfoo2.ToUniversalTime();
  1. The first line creates a DateTime with a DateTimeKind of "Unspecified".
  2. The second line creates a DateTimeOffset object from it. Because the DateTimeKind is Unspecified, the local system's offset from UTC is used.
  3. The third line converts this date to UTC.

Quoting the documentation for #2:

If the value of DateTime.Kind is DateTimeKind.Local or DateTimeKind.Unspecified, the DateTime property of the new instance is set equal to dateTime, and the Offset property is set equal to the offset of the local system's current time zone.
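To see that rule in action, here is a small demonstration (my addition) of how DateTime.Kind drives the offset the DateTimeOffset(DateTime) constructor picks:

var unspecified = new DateTime(2010, 10, 20); // Kind == Unspecified
var utc = new DateTime(2010, 10, 20, 0, 0, 0, DateTimeKind.Utc);

Console.WriteLine(new DateTimeOffset(unspecified).Offset); // the local machine's offset, e.g. 09:00:00 on a UTC+9 machine
Console.WriteLine(new DateTimeOffset(utc).Offset);         // 00:00:00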

Now let's write out the round-trip ("o") format date strings for steps 1-3:

2010-10-20T00:00:00.0000000
2010-10-20T00:00:00.0000000+09:00
2010-10-19T15:00:00.0000000+00:00

I'm in UTC+9, so the DateTimeOffset was rightly created with an offset of +9h. Converting that to universal takes us to 3pm on the 19th. Unfortunately, this causes the output of .ToUnixTimeMilliseconds() to be 1287500400000, which is 2010-10-19T15:00:00Z. The value has become dependent on the machine's timezone.
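To make that machine dependence concrete, here is a sketch (my addition) that pins the offset explicitly rather than relying on the local timezone:

var dt = new DateTime(2010, 10, 20);
// Interpreting midnight as a UTC+9 local time shifts the instant back 9 hours:
Console.WriteLine(new DateTimeOffset(dt, TimeSpan.FromHours(9)).ToUnixTimeMilliseconds()); // 1287500400000
// Interpreting it as UTC gives the timezone-independent value:
Console.WriteLine(new DateTimeOffset(dt, TimeSpan.Zero).ToUnixTimeMilliseconds());         // 1287532800000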

So, now let's take a look at your second example:

DateTime dtfoo = new DateTime(2010, 10, 20);
DateTimeOffset dtfoo2 = new DateTimeOffset(dtfoo).ToUniversalTime();
long afoo = (Int64)(dtfoo2.Subtract(new DateTime(1970, 1, 1))).TotalMilliseconds;

OK, let's split this into the different parts so that we can see what time the system thinks they represent (remember that I'm in UTC+9):

  1. new DateTime(2010, 10, 20).ToString("o") - 2010-10-20T00:00:00.0000000
  2. new DateTimeOffset(dtfoo).ToString("o") - 2010-10-20T00:00:00.0000000+09:00
  3. new DateTimeOffset(dtfoo).ToUniversalTime().ToString("o") - 2010-10-19T15:00:00.0000000+00:00
  4. new DateTime(1970, 1, 1).ToString("o") - 1970-01-01T00:00:00.0000000

So you're effectively performing this calculation:

(DateTimeOffset.Parse("2010-10-19T15:00:00.0000000+00:00") - DateTime.Parse("1970-01-01T00:00:00.0000000")).TotalMilliseconds

This outputs 1287532800000, which equates to 2010-10-20T00:00:00Z. This gives you the correct result because of how the subtraction is done:

  1. The DateTime is implicitly cast to a DateTimeOffset, equivalent to new DateTimeOffset(DateTime.Parse("1970-01-01T00:00:00.0000000")) - this means that both input dates have been through the same timezone changes.
  2. The two dates are then both converted to UTC DateTime objects via the DateTimeOffset's .UtcDateTime property before the subtraction is performed (see the sketch below).
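To make those two steps concrete, here is a sketch of the equivalent calculation (my reconstruction of the documented subtraction semantics, assuming a UTC+9 machine; the variable names are mine):

var left = DateTimeOffset.Parse("2010-10-19T15:00:00.0000000+00:00");
var right = new DateTimeOffset(new DateTime(1970, 1, 1)); // implicit cast picks up the local offset, e.g. +09:00
// Both sides are normalized to UTC before subtracting, so a constant local offset cancels out:
TimeSpan diff = left.UtcDateTime - right.UtcDateTime;
Console.WriteLine((long)diff.TotalMilliseconds); // 1287532800000 on a UTC+9 machine

Note that the local offset only cancels out when it is identical at both dates. Japan has no DST, but CET observes it: the epoch falls in January (+01:00) while 2010-10-20 falls in CEST (+02:00), and that one-hour mismatch is why your Option 2 produced 1287529200000 rather than 1287532800000.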

So how can we fix your original example? We can take the local timezone offset out of the equation by specifying the offset when constructing the DateTimeOffset:

DateTime dtfoo = new DateTime(2010, 10, 20);
DateTimeOffset dtfoo2 = new DateTimeOffset(dtfoo, TimeSpan.Zero).ToUniversalTime();
long afoo = dtfoo2.ToUnixTimeMilliseconds();

This now gives us the same value as we got in the previous test: 1287532800000. If we simplify this by using DateTimeOffset.Parse, we can confirm that we're on the right track:

Console.WriteLine(DateTimeOffset.Parse("2010-10-20T00:00:00Z").ToUnixTimeMilliseconds());

And we can see that this also outputs 1287532800000.

So, to conclude, your issue stems from how the DateTimeOffset(DateTime) constructor handles dates with a DateTimeKind of Unspecified or Local. It skews your resulting universal time depending on your machine's timezone, which leads to an incorrect unix time offset. To solve it, simply create your DateTimeOffset in one of the ways described above.
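As one further sketch (my addition, equivalent in effect to the fixes above): stamping the DateTime as UTC up front also removes the dependence on the machine's timezone, because the DateTimeOffset(DateTime) constructor assigns a zero offset when Kind is Utc:

DateTime dtfoo = new DateTime(2010, 10, 20, 0, 0, 0, DateTimeKind.Utc);
long afoo = new DateTimeOffset(dtfoo).ToUnixTimeMilliseconds();
Console.WriteLine(afoo); // 1287532800000 on any machine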

answered Oct 05 '22 at 23:10 by DiplomacyNotWar