I'm wondering what the precision of the Timer class in System.Timers is, since its Interval property is a double (which would seem to indicate that you can have fractions of milliseconds). What is it?
The following example creates a System.Timers.Timer that raises its Elapsed event every two seconds (2000 milliseconds), sets up an event handler for the event, and starts the timer. The event handler displays the value of the ElapsedEventArgs.SignalTime property each time it is raised.
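A minimal sketch of that setup, following the documented System.Timers.Timer pattern (class and method names here are just illustrative):

```csharp
using System;
using System.Timers;

class Example
{
    private static Timer aTimer;

    static void Main()
    {
        // Create a timer with a two-second interval (2000 ms).
        aTimer = new Timer(2000);

        // Hook up the Elapsed event and start the timer.
        aTimer.Elapsed += OnTimedEvent;
        aTimer.AutoReset = true;   // raise Elapsed repeatedly
        aTimer.Enabled = true;     // start the timer

        Console.WriteLine("Press Enter to exit.");
        Console.ReadLine();
        aTimer.Stop();
        aTimer.Dispose();
    }

    private static void OnTimedEvent(object source, ElapsedEventArgs e)
    {
        // SignalTime is the time at which the Elapsed event was raised.
        Console.WriteLine("The Elapsed event was raised at {0:HH:mm:ss.fff}", e.SignalTime);
    }
}
```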
The AutoReset property configures the timer to raise the Elapsed event repeatedly at the interval defined by the Interval property. If you want the Elapsed event to be raised only once after the interval has elapsed, set AutoReset to false.
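For the one-shot case, a small sketch (variable names are illustrative) might look like this:

```csharp
using System;

// One-shot: with AutoReset = false the Elapsed event fires a single time,
// five seconds from now, and the timer then stays stopped.
var oneShot = new System.Timers.Timer(5000);
oneShot.AutoReset = false;
oneShot.Elapsed += (s, e) => Console.WriteLine($"Fired once at {e.SignalTime:HH:mm:ss.fff}");
oneShot.Start();

Console.ReadLine();   // keep the process alive long enough to observe the event
```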
Use of Timer Control
A Timer control has no visual representation; it works as a component in the background.
The Interval property sets or gets the interval, in milliseconds, at which the timer raises its Elapsed event. The timer repeats its task at each interval.
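Because Interval is a double, a fractional value compiles, but as discussed below the operating system's timer granularity decides what you actually get. A tiny illustrative sketch (the 500.5 ms value is arbitrary):

```csharp
using System;

// Interval is a double, so a fractional value is accepted; the OS timer
// granularity (see below) limits the precision you actually observe.
var t = new System.Timers.Timer();
t.Interval = 500.5;                        // requested interval, in milliseconds
t.Elapsed += (s, e) => Console.WriteLine($"tick at {e.SignalTime:HH:mm:ss.fff}");
t.Start();

Console.ReadLine();                        // keep the process alive to watch the ticks
```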
Windows desktop OSes really aren't accurate below about 40ms. The OS simply isn't real-time and therefore presents significant non-deterministic jitter. That means that while it may report values down to the millisecond or even smaller, you can't really count on those values to be meaningful. So even if the Timer interval gets set to some sub-millisecond value, you can't rely on the time between setting it and it firing to actually be what you asked for.
Add to this the fact that the entire framework you're running under is non-deterministic (the GC could suspend you and run a collection during the time when the Timer should fire) and you end up with loads and loads of risk trying to do anything that is time-critical.
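One way to see this for yourself is to request a very short interval and measure the actual gaps with a Stopwatch; the sketch below (the 1 ms request and names are just for illustration) will typically print gaps far larger than requested, in line with the figures mentioned here, though the exact numbers depend on the machine and the OS timer resolution:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

var sw = Stopwatch.StartNew();
double last = 0;

// Ask for a 1 ms interval and measure what we actually get.
var timer = new System.Timers.Timer(1);
timer.Elapsed += (s, e) =>
{
    double now = sw.Elapsed.TotalMilliseconds;
    Console.WriteLine($"requested 1 ms, actual gap: {now - last:F1} ms");
    last = now;
};
timer.Start();

Thread.Sleep(1000);   // collect roughly a second of ticks
timer.Stop();
```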
A couple of years ago I found it to be accurate to about 16ms... but I unfortunately don't remember the details.