I have high-precision dates stored in SQL Server, e.g.
2009-09-15 19:43:43.910
However, when I convert that value into a DateTime, the milliseconds value of the resulting DateTime is 0:
reader["Timestamp"] = 15/09/2009 19:43:43.000
Having these DateTime values precise down to the millisecond is very important to me - what is the best way to achieve this?
UPDATE: This is the code that performs the conversion:
DateTime myDate = (DateTime)reader["Timestamp"];
There is nothing special about the SELECT statement; in fact, it is a SELECT * - no fancy casts or anything.
It appears that the DateTime object returned by the SqlDataReader simply is not populated with the millisecond value.
We can use the DATEPART() function to get the MILLISECOND part of a DateTime in SQL Server; here we need to specify the datepart parameter of the DATEPART function as millisecond or ms.
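As a minimal sketch (the table name MyTable and the connection string are assumptions, not from the question; the [Timestamp] column is as in the question), the millisecond part can be read through ADO.NET like this:

using System;
using System.Data.SqlClient;

class DatePartDemo
{
    static void Main()
    {
        // Connection string is a placeholder; adjust for your server.
        using (var connection = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
        using (var command = new SqlCommand(
            "SELECT DATEPART(millisecond, [Timestamp]) FROM MyTable", connection))
        {
            connection.Open();
            // DATEPART returns an int, e.g. 910 for 19:43:43.910.
            int ms = (int)command.ExecuteScalar();
            Console.WriteLine(ms);
        }
    }
}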
The DATEADD function is what you are looking for. Use millisecond as the first parameter to tell it that you are adding milliseconds, and 1 as the second parameter for the number of milliseconds to add.
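As a command string it might look like the following (a sketch; MyTable is assumed, and connection is an open SqlConnection as in the sketch above):

// Sketch: DATEADD(millisecond, 1, ...) adds one millisecond to each Timestamp value.
var command = new SqlCommand(
    "SELECT DATEADD(millisecond, 1, [Timestamp]) FROM MyTable", connection);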
I had this same problem, and after some reading it turns out that when you retrieve the date as you were doing
DateTime myDate = (DateTime)reader["Timestamp"];
the SqlDataReader drops the milliseconds. However, if you use the GetDateTime method of the SqlDataReader, it returns a DateTime object which preserves the milliseconds:
reader.GetDateTime(reader.GetOrdinal("Timestamp"));
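In the context of a full read loop it might look like this (a sketch reusing the connection and command setup from the sketch above; command is assumed to hold the SELECT from the question):

using (var reader = command.ExecuteReader())
{
    while (reader.Read())
    {
        // GetDateTime returns the column as a DateTime, keeping the fractional seconds.
        DateTime myDate = reader.GetDateTime(reader.GetOrdinal("Timestamp"));
        Console.WriteLine(myDate.Millisecond); // e.g. 910 rather than 0
    }
}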
Maybe this (Difference between DateTime in C# and DateTime in SQL Server) will help a little.