I use an ASP.NET [WebMethod] to push a .NET object back to the Ajax call in the browser.
One of the object's properties is a DateTime.
When it arrives at the browser, the time is seven hours earlier than the time stored in SQL Server.
Okay, so my browser is in Peru (GMT-5) and the server is in Germany (currently GMT+2); that's where the seven hours come from.
As a fix, I send the client's UTC offset to the server with the Ajax request:
var d = new Date();
var clientOffsetMinutes = d.getTimezoneOffset(); // minutes behind UTC, e.g. 300 for GMT-5
Then, on the server, I figure out the offset there:
// the server's local time zone
TimeZoneInfo tz = TimeZoneInfo.Local;
// its base offset from UTC, in whole hours
int offset = tz.BaseUtcOffset.Hours;
// add one hour if the server is currently in daylight saving time
if (tz.IsDaylightSavingTime(DateTime.Now))
{
    offset++;
}
Now I can fix the time field in my object before it is sent to the browser.
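Roughly, the adjustment looks like this. This is only a sketch: the service class, the MyDto type, the clientOffsetMinutes parameter, and the hard-coded sample value are illustrative, and I use GetUtcOffset here (which handles daylight saving automatically) instead of the manual DST check above:

using System;
using System.Web.Services;
using System.Web.Script.Services;

// illustrative DTO; "TheDate" mirrors the property name in the JSON below
public class MyDto
{
    public DateTime TheDate { get; set; }
}

[ScriptService]
public class TimeService : WebService
{
    [WebMethod]
    public MyDto GetItem(int clientOffsetMinutes)
    {
        // stand-in for the real data access; the DB value carries no time zone
        MyDto dto = new MyDto { TheDate = new DateTime(2009, 10, 15, 22, 0, 0) };

        // the server's offset from UTC in minutes, DST included
        int serverOffsetMinutes =
            (int)TimeZoneInfo.Local.GetUtcOffset(DateTime.Now).TotalMinutes;

        // JavaScript's getTimezoneOffset() reports minutes *behind* UTC
        // (Peru returns +300), so flip the sign to get the client's offset
        int clientLocalOffsetMinutes = -clientOffsetMinutes;

        // shift the value so that, after the serializer applies the server's
        // offset and the browser applies the client's, the original
        // wall-clock time survives the round trip
        dto.TheDate = dto.TheDate.AddMinutes(serverOffsetMinutes - clientLocalOffsetMinutes);
        return dto;
    }
}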
My real question is: how does the serializer know about the seven hours?
The HTTP request doesn't include any time information.
Is it too much to ask to get the exact time as it is stored in the database?
Update:
Here's an example. The date in the database is 2009-Oct-15 22:00.
There is no time zone information attached to that.
When I call my WebMethod on my dev machine, where client and server are obviously in the same time zone, the JSON from the server is:
{"d":{"TheDate":"\/Date(1255662000000)\/"}}
The JSON from the remote server in Germany is:
{"d":{"TheDate":"\/Date(1255636800000)\/"}}
There is a difference of 7 hours in the JSON as seen in Firebug. At this point there is no JavaScript involved yet.
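Just to spell out the arithmetic (a quick sanity check of my own, not part of any response): both servers serialized the same wall-clock value, 2009-Oct-15 22:00, but each assumed its own local zone when converting it to UTC, and the two epoch values differ by exactly seven hours:

using System;

class EpochCheck
{
    static void Main()
    {
        // /Date(n)/ is milliseconds since 1970-01-01 00:00:00 UTC
        long devMachine   = 1255662000000; // 22:00 at GMT-5 => 2009-10-16 03:00 UTC
        long remoteServer = 1255636800000; // 22:00 at GMT+2 => 2009-10-15 20:00 UTC

        Console.WriteLine((devMachine - remoteServer) / 3600000.0); // prints 7
    }
}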
One idea I had was that ASP.NET attaches a time zone to the session, but that doesn't seem to be the case.
To answer the OP's question: the time zone information is implicit in the conversion to the JSON /Date()/ format, because that value is relative to UTC. For example, on my server here in NY, if I return DateTime.Parse("1/1/1970"), it serializes as /Date(18000000)/, i.e. 5 hours' worth of milliseconds since 1/1/1970 UTC (New York was UTC-5 on that date), because the conversion says, "hey, it's 1/1/1970 00:00:00 here in NY, so it must be 1/1/70 05:00:00 back over in Greenwich."
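You can watch that conversion happen by calling JavaScriptSerializer directly; as far as I know that is the serializer the ASMX ScriptService pipeline uses, and the anonymous type here is just for illustration:

using System;
using System.Web.Script.Serialization;

class SerializerDemo
{
    static void Main()
    {
        // Kind is Unspecified, so the serializer treats it as server-local time
        DateTime d = DateTime.Parse("1/1/1970");

        // the value is converted to UTC before being written as /Date(milliseconds)/
        string json = new JavaScriptSerializer().Serialize(new { TheDate = d });

        // on a server in US Eastern time (UTC-5 on that date) this prints:
        // {"TheDate":"\/Date(18000000)\/"}
        Console.WriteLine(json);
    }
}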
Now, if a client in California receives this date notation and simply instantiates a JavaScript Date from the milliseconds (e.g. new Date(18000000)), the browser will say, "hey, here is a date object, which I know is relative to UTC, and I know I am 8 hours behind Greenwich, so it must be 12/31/1969 21:00:00."
So this is a pretty clever way to deal with time, so that it is "correct" in all time zones, and such that all localization is handled by the user's browser. Unfortunately, we are often dealing with just a raw date that we don't want to be time zone relative (say, a birthday). If we need to keep the date the same, there are two ways that I know of.
The first, as you have done above, is to adjust the time (although I think you need to do it at the browser, too, if you want it to work in any time zone).
The other way would be to return it as a string, formatted already. This is the method I normally employ, but I am normally working with US clients (so I can return MM/DD/YYYY, for example, and they don't get mad at me for being American).
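As a sketch of that approach (the method name, the sample value, and the format string are just examples):

using System;
using System.Web.Services;
using System.Web.Script.Services;

[ScriptService]
public class DateService : WebService
{
    [WebMethod]
    public string GetTheDateAsString()
    {
        // stand-in for the DB value; formatting on the server means no
        // time zone conversion happens on either end
        DateTime theDate = new DateTime(2009, 10, 15, 22, 0, 0);
        return theDate.ToString("MM/dd/yyyy HH:mm");
    }
}

The browser then just displays the string as-is, so the value shown is exactly the value stored.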
To avoid weird bugs and having to deal with these kinds of issues, you should always deal in UTC and convert to local time at the last possible moment.
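One way to do that with this serializer, as a minimal sketch (it assumes the column really holds UTC, and the names are illustrative): mark the value as DateTimeKind.Utc so no server-local offset gets baked into the /Date()/ value, and let the browser make the single conversion to the user's local time at display time.

using System;
using System.Web.Script.Serialization;

class UtcDemo
{
    static void Main()
    {
        // a value read from the database arrives with Kind == Unspecified
        DateTime fromDb = new DateTime(2009, 10, 15, 22, 0, 0);

        // if the column really holds UTC, say so explicitly; the serializer
        // then emits the same /Date()/ value regardless of the server's zone
        DateTime asUtc = DateTime.SpecifyKind(fromDb, DateTimeKind.Utc);

        Console.WriteLine(new JavaScriptSerializer().Serialize(new { TheDate = asUtc }));
        // expected: {"TheDate":"\/Date(1255644000000)\/"} on any server
    }
}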
How are you examining the DateTime once it arrives in the browser? Are you sure the raw serialized format doesn't include the offset as part of the DateTime object? In that case it could be reconstituted at the other end in local time.