Why does increasing timer resolution via timeBeginPeriod impact power consumption?

Tags: c#, timer, winapi

I am currently writing an application in C# in which I need to fire a timer approximately every 5 milliseconds. From some research, it appears the best way to do this involves P/Invoking timeBeginPeriod(...) to change the resolution of the system timer. It works well enough in my sample code.
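For reference, here is roughly what my sample code does; the 1 ms resolution request, the class name, and the use of System.Threading.Timer are just my own illustrative choices, not a recommendation from anywhere:

using System;
using System.Runtime.InteropServices;
using System.Threading;

class HighResolutionTimerDemo
{
    // winmm.dll exports timeBeginPeriod/timeEndPeriod; every Begin call
    // must be matched by an End call with the same value.
    [DllImport("winmm.dll")]
    private static extern uint timeBeginPeriod(uint uPeriod);

    [DllImport("winmm.dll")]
    private static extern uint timeEndPeriod(uint uPeriod);

    static void Main()
    {
        // Request 1 ms system timer resolution so the 5 ms period is honored.
        timeBeginPeriod(1);
        try
        {
            using (var timer = new Timer(
                _ => Console.WriteLine(DateTime.UtcNow.ToString("HH:mm:ss.fff")),
                null, dueTime: 0, period: 5)) // fire roughly every 5 ms
            {
                Thread.Sleep(1000); // let it run for a second
            }
        }
        finally
        {
            // Restore the previous resolution as soon as it is no longer needed.
            timeEndPeriod(1);
        }
    }
}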

I found an interesting warning about using this function on Larry Osterman's MSDN Blog in this entry:

Adam: calling timeBeginPeriod increases the accuracy of GetTickCount as well.

using timeBeginPeriod is a hideously bad idea in general - we've been actively removing all of the uses of it in Windows because of the power consumption consequences associated with using it.

There are better ways of ensuring that your thread runs in a timely fashion.

Does anyone know exactly why this occurs, or what those "better ways" (which are unspecified in the thread) might be? How much extra power draw are we talking about?

asked Sep 28 '11 by Chuu


1 Answer

Because it causes more CPU usage: raising the timer resolution makes the system timer interrupt fire more often, which wakes the processor more frequently and keeps it from settling into its low-power idle states. A good explanation is in the Microsoft whitepaper Timers, Timer Resolution, and Development of Efficient Code.

answered Oct 05 '22 by Jim Mischel