When calculating the millisecond difference between two DateTime
objects, I always seem to get back a number whose decimal part is the same as its integer part. For example: 1235.1235
Why does this happen? Am I doing something wrong? Is this a quirk of the language, a limitation of DateTime
granularity, or something else?
This can be demonstrated using the following code:
using System;
using System.Globalization;
using System.Threading;

DateTime then = DateTime.Now;   // capture start time
Thread.Sleep(1234);             // wait roughly 1234 ms
DateTime now = DateTime.Now;    // capture end time
TimeSpan taken = now - then;
string result = taken.TotalMilliseconds.ToString(CultureInfo.InvariantCulture);
// result = "1235.1235"
As commented by CodesInChaos:
DateTime isn't accurate to this level of precision: see C# DateTime.Now precision
However, that doesn't quite explain this behavior.
There is a technical explanation for this, though I cannot prove that it explains your observation; it certainly doesn't repro on my machine.
For starters, you are looking at noise digits; the operating system clock isn't nearly accurate enough to give you sub-millisecond accuracy. So make sure you never rely on those digits to do anything important. If you want to measure an interval with high resolution, you must use Stopwatch instead, as sketched below.
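A minimal sketch of the same measurement done with Stopwatch, which reads the high-resolution performance counter rather than the wall clock (the 1234 ms sleep is just the value from the question):

using System;
using System.Diagnostics;
using System.Threading;

// Stopwatch reads the high-resolution performance counter instead of the
// system clock, so time-service adjustments don't skew the measurement.
Stopwatch sw = Stopwatch.StartNew();
Thread.Sleep(1234);             // the same ~1234 ms wait as in the question
sw.Stop();
Console.WriteLine(sw.Elapsed.TotalMilliseconds);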
The operating system clock is affected by updates from a time server. Most machines are set up to periodically contact time.windows.com
to re-calibrate the clock. This gets rid of clock drift; the machine hardware typically isn't good enough to keep time accurate to better than a second over a month. Low-tolerance crystals are expensive and are never completely free of drift caused by temperature and aging effects. And a leap second gets inserted once in a while to keep clocks synchronized with the slowing rotation of the planet. The last one crashed a lot of Linux machines; google "Linux leap second bug" for some fun reading.
What matters here is what happens when your machine gets an update that requires it to adjust the clock. Windows does not suddenly jump the clock value; that would cause major problems for programs that are paying attention to the clock and expect it to increment consistently in predictable amounts.
Instead, it adds a little extra with every clock tick increment, in effect making the clock run slightly slower or faster so it gradually makes up the difference and becomes accurate again. Perhaps you can see where this goes: the extra added microseconds are proportional to the length of the interval, so seeing the interval repeated in the noise digits is plausible.
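As a rough back-of-the-envelope check (the 100 ppm rate is an assumption, not a measured value): a clock being slewed about 0.01% fast stretches a real 1235 ms interval by 1235/10000 ms, which reproduces the pattern in the question.

// Illustrative only: models a clock running fast by a hypothetical 100 ppm slew.
double actualMs = 1235.0;                                     // real elapsed interval
double slewPpm = 100.0;                                       // assumed adjustment rate, parts per million
double observedMs = actualMs * (1.0 + slewPpm / 1_000_000.0);
// observedMs is approximately 1235.1235: the fraction mirrors the integer part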
The only real way to prove this theory is to P/Invoke GetSystemTimeAdjustment(). It reports whether a system time adjustment is in progress and by how much the clock is being nudged. You can then P/Invoke SetSystemTimeAdjustment() to disable the adjustment and observe whether that changes the value you see. Or just wait long enough for the clock to catch up so it no longer gets adjusted.
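A minimal P/Invoke sketch of the check described above; the declarations follow the documented kernel32 signatures, and the "adjustment in progress" interpretation (adjustments enabled and the adjustment value differing from the plain increment) is my reading of the API, so treat it as an assumption:

using System;
using System.Runtime.InteropServices;

class TimeAdjustmentCheck
{
    // kernel32 declaration; values are reported in 100-nanosecond units.
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool GetSystemTimeAdjustment(
        out uint lpTimeAdjustment,          // amount added to the clock at each increment
        out uint lpTimeIncrement,           // length of a clock interrupt interval
        out bool lpTimeAdjustmentDisabled); // true = no periodic slewing is applied

    static void Main()
    {
        if (GetSystemTimeAdjustment(out uint adjustment, out uint increment, out bool disabled))
        {
            Console.WriteLine($"Adjustment: {adjustment}, Increment: {increment}, Disabled: {disabled}");

            // If adjustments are enabled and the per-tick adjustment differs from
            // the plain increment, the clock is being slewed faster or slower.
            if (!disabled && adjustment != increment)
                Console.WriteLine("A system time adjustment appears to be in progress.");
        }
        else
        {
            Console.WriteLine($"GetSystemTimeAdjustment failed, error {Marshal.GetLastWin32Error()}");
        }
    }
}

Note that disabling the adjustment with SetSystemTimeAdjustment requires the SE_SYSTEMTIME_NAME privilege, so that part of the experiment needs an elevated process.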