When using ToLocalTime(), how is local time determined? Is local time from the server or the client? My assumption would be the server running the application.
The ToLocalTime method converts a DateTime value from UTC to local time. To convert the time in any designated time zone to local time, use the TimeZoneInfo.ConvertTime method. The value returned by the conversion is a DateTime whose Kind property always returns Local.
You can get the time off the server and do this: DateTime myTimeGMT = ServerTime.ToUniversalTime();
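The two steps above can be sketched as follows. This is a minimal sketch, not the answerer's exact code: serverTime stands in for whatever DateTime the server hands you, and a fixed -5:00 custom zone is built so the example runs on any machine (in real code you would typically use TimeZoneInfo.FindSystemTimeZoneById instead).

```csharp
using System;

class ConvertDemo
{
    static void Main()
    {
        // Hypothetical value captured on the server (Kind is Local there).
        DateTime serverTime = DateTime.Now;

        // Step 1: normalize to UTC first.
        DateTime myTimeGMT = serverTime.ToUniversalTime();

        // Step 2: convert the UTC value to any designated time zone.
        // A fixed -5:00 zone is used here only so the sketch is portable.
        TimeZoneInfo minusFive = TimeZoneInfo.CreateCustomTimeZone(
            "UTC-05", TimeSpan.FromHours(-5), "UTC-05", "UTC-05");

        DateTime designated = TimeZoneInfo.ConvertTimeFromUtc(myTimeGMT, minusFive);
        Console.WriteLine(designated);
    }
}
```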
In performing the conversion to local time, the method first converts the current DateTimeOffset object's date and time to Coordinated Universal Time (UTC) by subtracting the offset from the time. It then converts the UTC date and time to local time by adding the local time zone offset.
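The two-step conversion described above can be seen directly on DateTimeOffset. The source value here (13:00 at +02:00) is my own illustrative choice, not from the original text: subtracting the offset gives 11:00 UTC, and ToLocalTime then applies the local zone's offset to that same instant.

```csharp
using System;

class DtoDemo
{
    static void Main()
    {
        // Hypothetical source: 13:00 stamped with a +02:00 offset.
        var source = new DateTimeOffset(2024, 1, 15, 13, 0, 0, TimeSpan.FromHours(2));

        // Step 1: subtract the offset to reach UTC (13:00 - 2:00 = 11:00 UTC).
        DateTime utc = source.UtcDateTime;

        // Step 2: ToLocalTime adds the local zone's offset to that UTC value.
        DateTimeOffset local = source.ToLocalTime();

        // Both values describe the same instant in time.
        Console.WriteLine(local.UtcDateTime == utc); // True
    }
}
```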
Add the local time offset to the UTC time. For example, if your local time offset is -5:00, and if the UTC time is shown as 11:00, add -5 to 11. The time setting when adjusted for offset is 06:00 (6:00 A.M.).
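The arithmetic above can be checked in a few lines. The concrete date is arbitrary; only the 11:00 UTC time and the -5:00 offset come from the text.

```csharp
using System;

class OffsetArithmetic
{
    static void Main()
    {
        // UTC time of 11:00 with a local offset of -5:00, as in the text.
        DateTime utc = new DateTime(2024, 1, 15, 11, 0, 0, DateTimeKind.Utc);
        TimeSpan offset = TimeSpan.FromHours(-5);

        // Adding the offset to the UTC time yields 06:00 (6:00 A.M.) local.
        DateTime local = utc + offset;
        Console.WriteLine(local.TimeOfDay); // 06:00:00
    }
}
```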
It is the local time zone of the computer that code is running on. In an ASP.Net application, the code runs on the server - so that's the time zone that it will return.
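One way to see "whose" local time this is: TimeZoneInfo.Local is resolved on the machine executing the code, which in an ASP.NET application is the web server, never the visitor's browser. A minimal illustration:

```csharp
using System;

class WhereAmI
{
    static void Main()
    {
        // Resolved on the machine running this process; in ASP.NET,
        // that is the web server, not the client.
        Console.WriteLine(TimeZoneInfo.Local.Id);
        Console.WriteLine(DateTime.UtcNow.ToLocalTime());
    }
}
```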
The behavior of this function actually depends on the .Kind property of the source value. From the MSDN link you gave:

- Utc: this instance of DateTime is converted to local time.
- Local: no conversion is performed.
- Unspecified: this instance of DateTime is assumed to be a UTC time, and the conversion is performed as if Kind were Utc.

This is non-obvious behavior. You can read about other related problems with the .NET DateTime class here and here.
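The Kind-dependent behavior can be demonstrated with DateTime.SpecifyKind. The base time below is arbitrary; the comparisons hold on any machine because they compare the converted values against each other rather than against a particular time zone.

```csharp
using System;

class KindDemo
{
    static void Main()
    {
        DateTime baseTime = new DateTime(2024, 1, 15, 11, 0, 0);

        DateTime utcKind     = DateTime.SpecifyKind(baseTime, DateTimeKind.Utc);
        DateTime localKind   = DateTime.SpecifyKind(baseTime, DateTimeKind.Local);
        DateTime unspecified = DateTime.SpecifyKind(baseTime, DateTimeKind.Unspecified);

        // Utc: shifted by the machine's zone offset; result Kind is Local.
        Console.WriteLine(utcKind.ToLocalTime().Kind); // Local

        // Local: no conversion is performed -- the value is unchanged.
        Console.WriteLine(localKind.ToLocalTime() == localKind); // True

        // Unspecified: treated exactly as if Kind were Utc.
        Console.WriteLine(unspecified.ToLocalTime() == utcKind.ToLocalTime()); // True
    }
}
```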
A few other points: ToLocalTime(), in this case, executes on the server. Therefore the time is evaluated on the server, and it'll return the server time to the client.
It is the local time on the server.
Local here is the timezone of the machine that the function executes on.