I need to replicate JavaScript's getTime() method in C#.
For simplicity, I chose a fixed date in UTC and compared the two results.
C#
DateTime e = new DateTime(2011, 12, 31, 0, 0, 0, DateTimeKind.Utc); // target date
DateTime s = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);   // Unix epoch
TimeSpan t = e - s;
var x = t.TotalMilliseconds.ToString();
=> 1325289600000
and the JavaScript results:
JavaScript
var d = new Date(2011, 12, 31, 0, 0, 0)
var utcDate = new Date(d.getUTCFullYear(), d.getUTCMonth(), d.getUTCDate(), d.getUTCHours(), d.getUTCMinutes(), d.getUTCSeconds());
utcDate.getTime()
=> 1327960800000
Any hints on what I'm doing wrong?
Thanks!
JavaScript months are zero-based; 12 means January of the next year.
You want 11.
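For example (a quick check you can run in any console; the variable names are just for illustration):
JavaScript
// Month index 11 = December, so this is Dec 31, 2011 (local time)
var dec = new Date(2011, 11, 31, 0, 0, 0);
// Month index 12 rolls over: this is actually Jan 31, 2012
var jan = new Date(2011, 12, 31, 0, 0, 0);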
If you meant for the input to be at UTC, you should be doing this instead:
var ts = Date.UTC(2011,11,31,0,0,0);
As SLaks pointed out, months run 0-11. But even then, you must initialize the date as UTC if you want the response in UTC. In your code, you were initializing a local date and then converting it to UTC, so the result would differ depending on the time zone of the computer running the code. With Date.UTC you get back a timestamp, not a Date object, and it will be the same result regardless of where it runs.
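As a minimal sketch of the difference (the local result depends on the machine's time zone):
JavaScript
// Same timestamp everywhere: milliseconds for 2011-12-31T00:00:00Z
var ts = Date.UTC(2011, 11, 31, 0, 0, 0); // 1325289600000
// If you need a Date object rather than a raw timestamp, wrap it:
var utcDate = new Date(ts);
utcDate.getTime(); // 1325289600000
// By contrast, this is midnight in *local* time, so its getTime()
// value shifts with the time zone of the machine running the code:
var localDate = new Date(2011, 11, 31, 0, 0, 0);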
From Chrome's debugging console:
> Date.UTC(2011, 11, 31, 0, 0, 0)
1325289600000
This is the same value returned by your .NET code, which looks just fine, except that I would return a long, not a string.
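For instance, a sketch of the C# side returning a long (on .NET 4.6 or later, DateTimeOffset.ToUnixTimeMilliseconds() gives the same number directly):
C#
DateTime e = new DateTime(2011, 12, 31, 0, 0, 0, DateTimeKind.Utc);
DateTime s = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
// Cast to long rather than formatting as a string
long millis = (long)(e - s).TotalMilliseconds; // 1325289600000
// Equivalent, using the built-in helper:
long millis2 = new DateTimeOffset(e).ToUnixTimeMilliseconds();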