Timer increases memory usage in C# app

Tags: c#, timer

I'm making a C# WinForms app (VS2010, .NET 4) that uses a timer with a 1-second interval. Watching Task Manager, the memory usage shown for my app on the Processes tab seems to increase with each interval. I do nothing special in the timer's Tick event; I just increment an integer variable and display it in a label.

Is this normal? Should I be concerned about this memory growth? I'm going to run this program on my server (through Remote Desktop); would it cause any problems there? Would it run out of memory? I'm using the Timer from the VS toolbox.
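For reference, the setup described above looks roughly like the following sketch (the control and field names `label1` and `counter` are assumptions, since the original code isn't shown):

```csharp
using System;
using System.Windows.Forms;

public class CounterForm : Form
{
    private readonly Label label1 = new Label { Dock = DockStyle.Fill };
    // System.Windows.Forms.Timer, as dragged in from the VS toolbox.
    private readonly Timer timer = new Timer { Interval = 1000 }; // 1 second

    private int counter;

    public CounterForm()
    {
        Controls.Add(label1);
        timer.Tick += (s, e) =>
        {
            counter++;                        // the only work done per tick
            label1.Text = counter.ToString(); // allocates a short-lived string each tick
        };
        timer.Start();
    }
}
```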

Asked Jul 07 '11 by Ali_dotNet
2 Answers

Let's take the following example which updates a label every second with the current time:

var timer = new Timer    // System.Windows.Forms.Timer
{
    Interval = 1000,     // milliseconds
};
timer.Tick += (s, evt) =>
{
    label1.Text = DateTime.Now.ToLongTimeString();
};
timer.Start();
timer.Start();

If your code looks like this, you shouldn't be worried about the memory usage. Each tick makes small, short-lived allocations (such as the new time string), and these accumulate on the managed heap until the garbage collector runs, which is why the number in Task Manager creeps up. The garbage collector can run at any time to free that memory; you just cannot determine when it will happen.

Answered Sep 26 '22 by Darin Dimitrov


Just for debugging, try forcing a garbage collection by running

GC.Collect();

Your memory usage should drop back to approximately where it was. By the way, you can do the same thing in the debugger by evaluating that expression in the Quick Watch window.
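If you'd rather measure from code than from Task Manager, `GC.GetTotalMemory` (a standard .NET API) reports the size of the managed heap, which is the part of memory the collector actually controls. A minimal console sketch, assuming you just want to see the heap shrink after a forced collection:

```csharp
using System;

class Program
{
    static void Main()
    {
        // Create some garbage, similar to allocating a string on every tick.
        for (int i = 0; i < 100000; i++)
        {
            var s = DateTime.Now.ToLongTimeString();
        }

        long before = GC.GetTotalMemory(false);

        // Force a full collection -- for debugging/diagnostics only,
        // not something to leave in production code.
        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();

        long after = GC.GetTotalMemory(false);
        Console.WriteLine("Managed heap before: " + before + ", after: " + after);
    }
}
```

In normal operation you would never call `GC.Collect()` yourself; this is only to confirm that the growth you see is uncollected garbage rather than a leak.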

Answered Sep 22 '22 by Ed Bayiates