Computing milliseconds since 1970 in C# yields different date than JavaScript

I need to reproduce the result of JavaScript's getTime method in C#.

For simplicity, I chose a fixed date in UTC and compared the C# result:

C#
DateTime e = new DateTime(2011, 12, 31, 0, 0, 0, DateTimeKind.Utc);
DateTime s = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
TimeSpan t = (e - s);
var x = t.TotalMilliseconds.ToString();
=> 1325289600000

with the JavaScript result:

JavaScript
var d = new Date(2011, 12, 31, 0, 0, 0)
var utcDate = new Date(d.getUTCFullYear(), d.getUTCMonth(), d.getUTCDate(), d.getUTCHours(), d.getUTCMinutes(), d.getUTCSeconds());
utcDate.getTime()
=> 1327960800000

Any hints on what I'm doing wrong?

Thanks!

asked Feb 27 '14 by mo5470
2 Answers

JavaScript months are zero-based, so 12 means January of the next year.

You want 11.
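
A quick sketch of the rollover, using standard Date behavior:

JavaScript
new Date(2011, 11, 31).getMonth() // 11 (December 2011)
new Date(2011, 12, 31).getMonth() // 0  (rolled over to January 2012)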

answered by SLaks

If you meant for the input to be in UTC, you should be doing this instead:

var ts = Date.UTC(2011,11,31,0,0,0);

As SLaks pointed out, months run 0-11. But even then, you must initialize the date as UTC if you want the result in UTC. In your code you were initializing a local date and then converting it to UTC, so the result would differ depending on the time zone of the computer running it. With Date.UTC you get back a timestamp, not a Date object, and it is the same result regardless of where it runs.
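
For illustration only, assume a machine pinned at UTC-5: the local constructor lands five hours later on the timeline, while Date.UTC does not move.

JavaScript
new Date(2011, 11, 31, 0, 0, 0).getTime() // 1325307600000 on a UTC-5 machine (varies by zone)
Date.UTC(2011, 11, 31, 0, 0, 0)           // 1325289600000 everywhere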

From Chrome's debugging console, Date.UTC(2011,11,31,0,0,0) evaluates to 1325289600000.

This is the same value returned from your .NET code, which looks just fine, except I would return a long, not a string.
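
A minimal sketch of that change, reusing your epoch arithmetic but casting to a long:

C#
DateTime e = new DateTime(2011, 12, 31, 0, 0, 0, DateTimeKind.Utc);
DateTime s = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
long ms = (long)(e - s).TotalMilliseconds; // 1325289600000

(On .NET 4.6 or later, new DateTimeOffset(e).ToUnixTimeMilliseconds() returns the same long directly.)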

answered by Matt Johnson-Pint