
System.Timers.Timer only gives a maximum of 64 events per second

I have an application that uses a System.Timers.Timer object to raise events that are processed by the main form (Windows Forms, C#). My problem is that no matter how short I set the .Interval (even to 1 ms), the event fires at most 64 times per second.

I know the Windows Forms timer has a 55 ms accuracy limit, but this is System.Timers.Timer, not the Forms one.

The application sits at 1% CPU, so it's definitely not CPU-bound. All it's doing is:

  • Set the Timer to 1 ms
  • When the event fires, increment a _Count variable
  • Set it to 1 ms again and repeat

_Count gets incremented a maximum of 64 times a second even when there's no other work to do.
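
Roughly, this is what the code looks like (a minimal sketch of the above; only _Count and the 1 ms interval come from the description, the rest is illustrative):

using System;
using System.Threading;

class Repro
{
    static int _Count;

    static void Main()
    {
        var timer = new System.Timers.Timer(1);  // ask for a 1 ms interval
        timer.Elapsed += (s, e) => Interlocked.Increment(ref _Count);
        timer.AutoReset = true;                  // fire repeatedly
        timer.Start();
        Thread.Sleep(1000);
        timer.Stop();                            // also keeps the timer rooted during the sleep
        Console.WriteLine(_Count);               // prints ~64, not ~1000
    }
}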

This is an "playback" application that has to replicate packets coming in with as little as 1-2 ms delay between them, so I need something that can reliably fire 1000 times a second or so (though I'd settle for 100 if I was CPU bound, I'm not).

Any thoughts?

asked Nov 22 '12 by Dave



2 Answers

Try Multimedia Timers: they provide the greatest accuracy possible for the hardware platform. These timers schedule events at a higher resolution than other timer services.

You will need the following Win API functions to query the timer resolution and to start and stop the timer:

[DllImport("winmm.dll")]
private static extern int timeGetDevCaps(ref TimerCaps caps, int sizeOfTimerCaps);

[DllImport("winmm.dll")]
private static extern int timeSetEvent(int delay, int resolution, TimeProc proc, int user, int mode);

[DllImport("winmm.dll")]
private static extern int timeKillEvent(int id);

You also need a callback delegate:

delegate void TimeProc(int id, int msg, int user, int param1, int param2);

And a timer capabilities structure:

[StructLayout(LayoutKind.Sequential)]
public struct TimerCaps
{
    public int periodMin; // minimum supported period, in ms
    public int periodMax; // maximum supported period, in ms
}

Usage:

TimerCaps caps = new TimerCaps();
// provides min and max supported period (ms)
timeGetDevCaps(ref caps, Marshal.SizeOf(caps));
int period = 1;
int resolution = 1;
int mode = 1; // TIME_PERIODIC = 1 (TIME_ONESHOT = 0)
TimeProc proc = new TimeProc(TimerCallback); // keep a reference so the delegate isn't garbage-collected
int timerId = timeSetEvent(period, resolution, proc, 0, mode); // save the id for timeKillEvent

And the callback:

void TimerCallback(int id, int msg, int user, int param1, int param2)
{
    // occurs every ~1 ms, on a winmm worker thread (not the UI thread)
}
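
Putting it together, a minimal self-contained sketch (the class and field names here are mine, not from the answer; note the delegate is kept in a static field so the GC can't collect it while the native timer is still running):

using System;
using System.Runtime.InteropServices;
using System.Threading;

class MultimediaTimerDemo
{
    [DllImport("winmm.dll")]
    static extern int timeSetEvent(int delay, int resolution, TimeProc proc, int user, int mode);

    [DllImport("winmm.dll")]
    static extern int timeKillEvent(int id);

    delegate void TimeProc(int id, int msg, int user, int param1, int param2);

    static TimeProc _proc;   // rooted here so the delegate outlives the native timer
    static int _count;

    static void Main()
    {
        _proc = TimerCallback;
        int timerId = timeSetEvent(1, 1, _proc, 0, 1); // 1 ms period, TIME_PERIODIC
        Thread.Sleep(1000);
        timeKillEvent(timerId);                        // always stop the timer when done
        Console.WriteLine(_count);                     // ~1000 on most hardware
    }

    static void TimerCallback(int id, int msg, int user, int param1, int param2)
    {
        Interlocked.Increment(ref _count); // runs on a winmm worker thread
    }
}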
answered Sep 17 '22 by Sergey Berezovskiy


You can stick to your design. You only need to set the system interrupt timer to run at its maximum frequency. To do that, execute the following code (C/C++, against the multimedia timer API) anywhere in your application:

// C/C++; requires <windows.h> and linking against winmm.lib
#define TARGET_RESOLUTION 1         // 1-millisecond target resolution

TIMECAPS tc;
UINT     wTimerRes;

if (timeGetDevCaps(&tc, sizeof(TIMECAPS)) != TIMERR_NOERROR)
{
    // Error; application can't continue.
}

wTimerRes = min(max(tc.wPeriodMin, TARGET_RESOLUTION), tc.wPeriodMax);
timeBeginPeriod(wTimerRes);

This forces the system's interrupt timer to run at its maximum frequency. It is a system-wide behavior, so it may even be done in a separate process. Don't forget to call

timeEndPeriod(wTimerRes);

when you are done, to release the resource and reset the interrupt period to its default. See Multimedia Timers for details.

You must match each call to timeBeginPeriod with a call to timeEndPeriod, specifying the same minimum resolution in both calls. An application can make multiple timeBeginPeriod calls as long as each call is matched with a call to timeEndPeriod.
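
Since the question is C#, here is a hedged P/Invoke sketch of the same pattern (the winmm.dll exports are the documented ones; the wrapper shape and names are my assumption):

using System;
using System.Runtime.InteropServices;

static class HighResPeriod
{
    [StructLayout(LayoutKind.Sequential)]
    struct TIMECAPS { public uint wPeriodMin; public uint wPeriodMax; }

    [DllImport("winmm.dll")] static extern uint timeGetDevCaps(ref TIMECAPS tc, uint sizeOfTimeCaps);
    [DllImport("winmm.dll")] static extern uint timeBeginPeriod(uint periodMs);
    [DllImport("winmm.dll")] static extern uint timeEndPeriod(uint periodMs);

    static uint _period;

    public static void Begin(uint targetMs)
    {
        var tc = new TIMECAPS();
        timeGetDevCaps(ref tc, (uint)Marshal.SizeOf(typeof(TIMECAPS)));
        _period = Math.Min(Math.Max(tc.wPeriodMin, targetMs), tc.wPeriodMax);
        timeBeginPeriod(_period);  // raise the system interrupt frequency
    }

    public static void End()
    {
        timeEndPeriod(_period);    // must pass the same value used in Begin
    }
}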

As a consequence, all timers (including the one in your current design) will operate at a higher frequency, since the granularity of the timers improves. A granularity of 1 ms can be obtained on most hardware.

Here is a list of interrupt periods obtained with various settings of wTimerRes for two different hardware setups (A and B):

[Plot omitted in the original: ActualResolution (interrupt period) vs. the setting of wTimerRes]

It can easily be seen that 1 ms is a theoretical value. ActualResolution is given in 100 ns units, so 9,766 represents 0.9766 ms, which is 1024 interrupts per second. (In fact it should be 0.9765625 ms, which would be 9,765.625 100 ns units, but that accuracy obviously does not fit into an integer and is therefore rounded by the system.)

It also becomes obvious that, e.g., platform A does not really support the full range of periods returned by timeGetDevCaps (values ranging between wPeriodMin and wPeriodMax).

Summary: The multimedia timer interface can be used to modify the interrupt frequency system-wide. As a consequence, all timers will change their granularity. Also, the system time update will change accordingly: it will increment more often and in smaller steps. But the actual behavior depends on the underlying hardware. This hardware dependency has gotten a lot smaller since the introduction of Windows 7 and Windows 8, as newer timing schemes have been introduced.

answered Sep 16 '22 by Arno