
Does 'timer' take more CPU power?

I've used a timer and found it very helpful for making decisions at a resolution of seconds or milliseconds. Now I have a strong feeling that keeping a timer running continuously gradually increases the consumption of processor cycles.

I've created a C# application and used a timer's Tick event to execute three instructions every 1000 milliseconds (1 second). I noticed that after 5 minutes the application was consuming 5% of the CPU, and 10% after 10 minutes.

If this growth remains constant, what will happen after 4-5 hours of running my application in the background?

Should I avoid excessive use of timers?

private void currentTime_Tick(object sender, EventArgs e)
{
    // Show the current wall-clock time on label1
    label1.Text = DateTime.Now.ToString("HH:mm:ss tt");
    // Show the elapsed time since startup on label2 (dt is the start time, i counts ticks)
    label2.Text = dt.AddSeconds(i).ToString("HH:mm:ss");
    i++;
}
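
For reference, here is a minimal, self-contained sketch of how such a handler is typically wired up in WinForms. The dt field (start time), the i counter, and the one-second Timer interval are assumptions inferred from the snippet above, not details given in the question.

using System;
using System.Drawing;
using System.Windows.Forms;

public class ClockForm : Form
{
    // Assumed fields: dt holds the start time, i counts elapsed seconds.
    private readonly System.Windows.Forms.Timer currentTime = new System.Windows.Forms.Timer();
    private readonly Label label1 = new Label { Location = new Point(10, 10), AutoSize = true };
    private readonly Label label2 = new Label { Location = new Point(10, 40), AutoSize = true };
    private readonly DateTime dt = DateTime.Now;
    private int i;

    public ClockForm()
    {
        Controls.Add(label1);
        Controls.Add(label2);
        currentTime.Interval = 1000;          // fire roughly once per second
        currentTime.Tick += currentTime_Tick;
        currentTime.Start();
        FormClosed += (s, e) => currentTime.Dispose();   // release the timer when the form closes
    }

    private void currentTime_Tick(object sender, EventArgs e)
    {
        label1.Text = DateTime.Now.ToString("HH:mm:ss tt");   // wall-clock time
        label2.Text = dt.AddSeconds(i).ToString("HH:mm:ss");  // elapsed time since startup
        i++;
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new ClockForm());
    }
}

Run on its own, this just updates two labels once a second and should sit near 0% CPU, so any steady growth points at what the handler does rather than at the timer itself.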
asked Oct 25 '11 by Muhammad Ali Dildar

1 Answer

It seems to me that the timer itself is not the cause, but rather whatever instructions it executes. Do you create objects in those instructions, or call something that runs in a separate thread? Starting threads or allocating resources and forgetting to release them can certainly lead to the behavior you described.
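
To illustrate the kind of mistake described above, here is a hypothetical contrast; the tickFont field, the Font allocation, and the handler names are made up for the example and are not taken from the question's code.

private Font tickFont;   // hypothetical field reused across ticks

// Leaky pattern: a new GDI font handle is created every second and the previous
// one is never disposed, so handle count and memory grow the longer the app runs.
private void leakyTimer_Tick(object sender, EventArgs e)
{
    label1.Font = new Font("Arial", 12f);
}

// Fixed pattern: create the expensive resource once and reuse it on every tick,
// so the per-tick work stays constant no matter how long the timer runs.
private void fixedTimer_Tick(object sender, EventArgs e)
{
    if (tickFont == null)
        tickFont = new Font("Arial", 12f);
    label1.Font = tickFont;
}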

answered Oct 21 '22 by vsz