How does SqlDateTime do its precision reduction

Tags: c#, sqldatetime

Consider the following program:

using System;
using System.Data.SqlTypes;

DateTime dateTime = new DateTime(634546165144647370);
SqlDateTime sqlDateTime = new SqlDateTime(dateTime);

Console.WriteLine("dateTime.TimeOfDay    = " + dateTime.TimeOfDay);
Console.WriteLine("sqlDateTime.TimeOfDay = " + sqlDateTime.Value.TimeOfDay);
Console.ReadLine();

That will have the following output:

dateTime.TimeOfDay    = 10:21:54.4647370  
sqlDateTime.TimeOfDay = 10:21:54.4630000

What is odd to me is that .464737 was rounded to .463. Shouldn't that have rounded to .464?

I assume I have not found a bug in .NET code, so the question is:

Why did it round to what it did? And how can I get client-side rounding that matches what SQL Server is going to do?

As a side note, I saved this DateTime to a SQL Server database (in a DateTime column), pulled it back out, and it came out as 10:21:54.4670000. So I am really confused. (I thought SqlDateTime would match up with what SQL Server was going to do.)

Note: Because I am using OData, I cannot use DateTime2 in SQL Server.

asked Oct 19 '11 by Vaccano

1 Answer

SQL Server DATETIME has an accuracy of 1/300 of a second (about 3.33 ms), with values rounded to increments of .000, .003, or .007 seconds - therefore you cannot store every possible millisecond value, and .464 simply isn't on that grid.
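As a rough illustration of that grid, the sketch below snaps a DateTime to the nearest 1/300 of a second on the client. The helper variables are my own approximation, not SqlDateTime's actual implementation; the simplest way to reproduce the exact reduction the question shows is still to round-trip through SqlDateTime itself.

using System;
using System.Data.SqlTypes;

DateTime dateTime = new DateTime(634546165144647370);

// Round-tripping through SqlDateTime applies the reduction shown in the
// question (.4647370 -> .4630000).
DateTime viaSqlDateTime = new SqlDateTime(dateTime).Value;

// Rough approximation of the DATETIME grid: snap the time-of-day to the
// nearest 1/300 of a second (about 33,333.3 ticks).
double ticksPerUnit = TimeSpan.TicksPerSecond / 300.0;
long units = (long)Math.Round(dateTime.TimeOfDay.Ticks / ticksPerUnit);
DateTime approx = dateTime.Date.AddTicks((long)Math.Round(units * ticksPerUnit));

Console.WriteLine(viaSqlDateTime.TimeOfDay); // 10:21:54.4630000
Console.WriteLine(approx.TimeOfDay);         // same 1/300 s slot (~.4633333)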

On SQL Server 2008 and later, you could use the DATETIME2 or TIME(x) data types, which have an accuracy of 100 ns - that should be plenty for "regular" use.
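If you do eventually move to datetime2, a minimal sketch of writing the value with full precision might look like the following; the table, column, and parameter names are hypothetical, and this is only an illustration of passing the parameter as DateTime2.

using System;
using System.Data;
using System.Data.SqlClient;

static class Datetime2Sketch
{
    // Sketch only: SqlDbType.DateTime2 keeps the full 100 ns tick precision
    // instead of snapping to the 3.33 ms DATETIME grid.
    public static void SaveStamp(SqlConnection connection, int id, DateTime stamp)
    {
        using (var cmd = new SqlCommand(
            "UPDATE dbo.Events SET Stamp = @stamp WHERE Id = @id", connection))
        {
            cmd.Parameters.Add("@stamp", SqlDbType.DateTime2).Value = stamp;
            cmd.Parameters.Add("@id", SqlDbType.Int).Value = id;
            cmd.ExecuteNonQuery();
        }
    }
}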

answered Sep 29 '22 by marc_s