I've used a timer and found it very helpful for making decisions on a scale of seconds or milliseconds. Now I suspect that keeping a timer running continuously causes processor usage to climb over time.
I created a C# application that uses a timer's Tick event to execute three instructions every 1000 milliseconds (1 second), and I noticed that after 5 minutes the application was consuming 5% of CPU, and 10% after 10 minutes.
If this trend continues, what will happen after 4-5 hours of running the application in the background?
Should I avoid excessive use of timers?
// Tick handler, fired once per second. dt is a DateTime field
// (the start time) and i is an int counter, both declared
// elsewhere on the form.
private void currentTime_Tick(object sender, EventArgs e)
{
    label1.Text = DateTime.Now.ToString("HH:mm:ss tt");   // current wall-clock time
    label2.Text = dt.AddSeconds(i).ToString("HH:mm:ss");  // elapsed time since dt
    i++;
}
It seems to me that the cause is not the timer itself but the instructions it executes. Do those instructions create objects, or call something that runs on a separate thread? Starting threads or allocating resources and forgetting to release them can certainly lead to the behavior you describe.
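For contrast, here is a minimal sketch of a timer whose Tick handler allocates nothing that outlives the tick (the ClockForm class and its layout are illustrative assumptions, not the asker's actual code); a handler this cheap should hold CPU use flat no matter how long it runs:

using System;
using System.Windows.Forms;

public class ClockForm : Form
{
    // One UI timer that fires every second; the handler below does
    // no persistent allocation, so CPU use should stay flat.
    private readonly Timer clock = new Timer { Interval = 1000 };

    public ClockForm()
    {
        clock.Tick += Clock_Tick;
        clock.Start();
    }

    private void Clock_Tick(object sender, EventArgs e)
    {
        // Only short-lived strings are created here; the garbage
        // collector reclaims them, so nothing accumulates per tick.
        Text = DateTime.Now.ToString("HH:mm:ss");
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing) clock.Dispose();  // release the timer with the form
        base.Dispose(disposing);
    }

    [STAThread]
    static void Main() => Application.Run(new ClockForm());
}

If the real handler starts threads, opens handles, or creates new timers on each tick, disposing them (or hoisting them out of the handler entirely) is the usual fix for the creeping CPU use described above.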