 

Why is DateTime based on Ticks rather than Milliseconds?

Why is the minimum resolution of a DateTime based on Ticks (100-nanosecond units) rather than on Milliseconds?

asked Jan 19 '13 at 14:01 by mas


People also ask

Are ticks milliseconds?

A single tick represents one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond (see TicksPerMillisecond) and 10 million ticks in a second.
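A quick sanity check of those numbers in C#, using only the documented TimeSpan constants:

    using System;

    class TickConstants
    {
        static void Main()
        {
            Console.WriteLine(TimeSpan.TicksPerMillisecond);  // 10000
            Console.WriteLine(TimeSpan.TicksPerSecond);       // 10000000
            // One tick is therefore 100 nanoseconds:
            Console.WriteLine(1e9 / TimeSpan.TicksPerSecond); // 100
        }
    }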

What is ticks in TimeSpan?

The smallest unit of time is the tick, which is equal to 100 nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond. The value of the Ticks property can be negative or positive to represent a negative or positive time interval.
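A minimal sketch of the signed-interval point (the -1.5 minutes is just an arbitrary example value):

    using System;

    class NegativeTicks
    {
        static void Main()
        {
            TimeSpan negative = TimeSpan.FromMinutes(-1.5);
            Console.WriteLine(negative.Ticks);        // -900000000, a negative interval
            Console.WriteLine(TimeSpan.FromTicks(1)); // 00:00:00.0000001, i.e. 100 ns
        }
    }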

How do I get microseconds in C#?

To convert a number of ticks to microseconds, just use: long microseconds = ticks / (TimeSpan.TicksPerMillisecond / 1000);
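Wrapped into a runnable sketch (DateTime.UtcNow.Ticks is just a convenient sample input):

    using System;

    class Microseconds
    {
        static void Main()
        {
            long ticks = DateTime.UtcNow.Ticks;
            // 10,000 ticks per millisecond / 1000 = 10 ticks per microsecond
            long microseconds = ticks / (TimeSpan.TicksPerMillisecond / 1000);
            Console.WriteLine(microseconds);
        }
    }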

How do you convert ticks to days?

Therefore, in order to calculate the number of days from the number of ticks (rounded to the nearest whole number), I first calculate the number of seconds by dividing by ten million, and then divide that by the number of seconds in a day (60 seconds in a minute, 60 minutes in an hour, 24 hours in a day, i.e. 86,400).
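For example, the manual route and the framework's own TimeSpan.FromTicks should agree (the tick count below is chosen to be exactly one day):

    using System;

    class TicksToDays
    {
        static void Main()
        {
            long ticks = 864000000000; // 86,400 s * 10,000,000 ticks/s = one day
            long seconds = ticks / TimeSpan.TicksPerSecond; // divide by ten million
            double days = seconds / (60.0 * 60 * 24);       // 86,400 seconds per day
            Console.WriteLine(days);                                // 1
            Console.WriteLine(TimeSpan.FromTicks(ticks).TotalDays); // 1
        }
    }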


1 Answer

  • TimeSpan and DateTime use the same ticks, making operations like adding a TimeSpan to a DateTime trivial (see the sketch at the end of this answer).
  • More precision is good. It's mainly useful for TimeSpan, but the reason above transfers that precision to DateTime.

    For example, Stopwatch measures short time intervals, often shorter than a millisecond, and it can return a TimeSpan.
    In one of my projects I used TimeSpan to address audio samples. 100 ns is short enough for that; milliseconds wouldn't be.

  • Even using millisecond ticks, you need an Int64 to represent DateTime. But then you're wasting most of the range, since years outside 0 to 9999 aren't really useful. So they chose the tick as small as possible while still allowing DateTime to represent the year 9999.

    There are about 2^61.5 100 ns ticks in that range. Since DateTime needs two bits for timezone-related tagging, 100 ns ticks are the smallest power-of-ten interval that fits an Int64 (a quick numeric check follows this list).
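A back-of-the-envelope check of that 2^61.5 claim (Math.Log2 requires .NET Core 3.0 or later, and the 365.25-day year is an approximation):

    using System;

    class RangeCheck
    {
        static void Main()
        {
            // Roughly 10,000 years expressed in 100 ns ticks:
            double approxTicks = 10000.0 * 365.25 * 86400 * 10000000;
            Console.WriteLine(Math.Log2(approxTicks));             // ~61.45
            // The framework's actual maximum agrees:
            Console.WriteLine(Math.Log2(DateTime.MaxValue.Ticks)); // ~61.45
            Console.WriteLine(long.MaxValue);                      // ~2^63
        }
    }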

So using longer ticks would decrease precision without gaining anything, and shorter ticks wouldn't fit into 64 bits. => 100 ns is the optimal value given the constraints.
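A minimal sketch of the first two bullets, shared ticks making DateTime + TimeSpan arithmetic trivial and Stopwatch expressing sub-millisecond intervals (the specific date is arbitrary):

    using System;
    using System.Diagnostics;

    class SharedTicks
    {
        static void Main()
        {
            // DateTime and TimeSpan share the 100 ns tick, so addition is plain tick arithmetic:
            DateTime start = new DateTime(2013, 1, 19, 14, 0, 0, DateTimeKind.Utc);
            TimeSpan offset = TimeSpan.FromTicks(1); // 100 ns
            Console.WriteLine((start + offset).Ticks - start.Ticks); // 1

            // Stopwatch can express intervals far shorter than a millisecond as a TimeSpan:
            var sw = Stopwatch.StartNew();
            sw.Stop();
            Console.WriteLine(sw.Elapsed.Ticks); // typically well under 10,000 (i.e. < 1 ms)
        }
    }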

answered Dec 25 '22 at 13:12 by CodesInChaos