 

C# DateTime.Now precision

I just ran into some unexpected behavior with DateTime.UtcNow while doing some unit tests. It appears that when you call DateTime.Now/UtcNow in rapid succession, you get back the same value for a longer-than-expected interval of time, rather than more precise millisecond increments.

I know there is a Stopwatch class that would be better suited for doing precise time measurements, but I was curious if someone could explain this behavior in DateTime. Is there an official precision documented for DateTime.Now (for example, precise to within 50 ms)? Why would DateTime.Now be made less precise than what most CPU clocks could handle? Maybe it's just designed for the lowest common denominator CPU?

public static void Main(string[] args)
{
    var stopwatch = new Stopwatch();
    stopwatch.Start();

    for (int i = 0; i < 1000; i++)
    {
        var now = DateTime.Now;
        Console.WriteLine(string.Format(
            "Ticks: {0}\tMilliseconds: {1}", now.Ticks, now.Millisecond));
    }

    stopwatch.Stop();
    Console.WriteLine("Stopwatch.ElapsedMilliseconds: {0}",
        stopwatch.ElapsedMilliseconds);

    Console.ReadLine();
}
asked Jan 26 '10 by Andy White




2 Answers

Why would DateTime.Now be made less precise than what most CPU clocks could handle?

A good clock should be both precise and accurate; those are different things. As the old joke goes, a stopped clock is exactly accurate twice a day, while a clock a minute slow is never accurate at any time. But the clock a minute slow is always precise to the nearest minute, whereas a stopped clock has no useful precision at all.

Why should the DateTime be precise to, say, a microsecond when it cannot possibly be accurate to the microsecond? Most people do not have any source for official time signals that are accurate to the microsecond. Therefore giving six digits after the decimal place of precision, the last five of which are garbage, would be lying.

Remember, the purpose of DateTime is to represent a date and time. High-precision timing is not at all the purpose of DateTime; as you note, that's the purpose of Stopwatch. The purpose of DateTime is to represent a date and time for purposes like displaying the current time to the user, computing the number of days until next Tuesday, and so on.

In short, "what time is it?" and "how long did that take?" are completely different questions; don't use a tool designed to answer one question to answer the other.

Thanks for the question; this will make a good blog article! :-)

answered by Eric Lippert


DateTime's precision is somewhat specific to the system it's being run on. The precision is related to the speed of a context switch, which tends to be around 15 or 16 ms. (On my system, it is actually about 14 ms from my testing, but I've seen some laptops where it's closer to 35-40 ms.)
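One rough way to observe that quantum on your own machine is to spin until DateTime.UtcNow reports a new value and print the size of each jump. This is an illustrative sketch, not a rigorous benchmark:

using System;

class DateTimeResolutionProbe
{
    static void Main()
    {
        // Spin until DateTime.UtcNow reports a new value, several times,
        // and print the size of each jump. On many Windows systems the
        // jumps cluster around the ~15 ms timer tick mentioned above.
        for (int i = 0; i < 5; i++)
        {
            long before = DateTime.UtcNow.Ticks;
            long after;
            do
            {
                after = DateTime.UtcNow.Ticks;
            } while (after == before);

            // One DateTime tick is 100 ns, so 10,000 ticks = 1 ms.
            Console.WriteLine("Observed step: {0:F3} ms", (after - before) / 10000.0);
        }
    }
}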

Peter Bromberg wrote an article on high precision code timing in C#, which discusses this.
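As a related aside (not from the linked article), Stopwatch itself reports whether it is backed by the high-resolution performance counter and what its tick frequency is, which is a quick way to check the timer resolution available on your machine:

using System;
using System.Diagnostics;

class StopwatchResolution
{
    static void Main()
    {
        // Stopwatch exposes its own timer characteristics.
        Console.WriteLine("High resolution: {0}", Stopwatch.IsHighResolution);
        Console.WriteLine("Frequency: {0} ticks/sec", Stopwatch.Frequency);
        Console.WriteLine("Resolution: {0:F1} ns per tick",
            1000000000.0 / Stopwatch.Frequency);
    }
}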

answered by Reed Copsey