I'm making a C# WinForms app (VS2010, .NET 4) that uses a timer with a 1 s interval. Watching Task Manager, the memory usage shown next to the app's name on the Processes tab increases with every tick, even though I do nothing special in the Tick event handler: I just increment an integer variable and display it in a label.
Is this normal? Should I be concerned about this memory growth? I'm going to run this program on my server (through Remote Desktop); could it cause problems there, or eventually run out of memory? I'm using the Timer from the VS toolbox.
Let's take the following example which updates a label every second with the current time:
var timer = new Timer
{
    Interval = 1000 // milliseconds
};
timer.Tick += (s, evt) =>
{
    label1.Text = DateTime.Now.ToLongTimeString();
};
timer.Start();
If you have code like this, you shouldn't be worried about memory usage. The garbage collector can run at any time to free memory; you just cannot determine when that happens, so the working-set figure in Task Manager will creep up between collections.
Just for debugging, try forcing a garbage collection by running
GC.Collect();
Your memory usage should drop back to approximately where it was. By the way, you can do this in the debugger by evaluating that expression in Quick Watch.
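If you'd rather confirm this in code than eyeball Task Manager, a minimal console sketch along these lines (the allocation loop simulating the per-tick garbage is hypothetical, not your actual handler) compares the managed heap size before and after a forced collection using `GC.GetTotalMemory`:

```csharp
using System;

class MemoryCheck
{
    static void Main()
    {
        // Produce some short-lived garbage, roughly what a Tick
        // handler creates each second when it formats a string.
        for (int i = 0; i < 100000; i++)
        {
            var s = DateTime.Now.ToLongTimeString();
        }

        long before = GC.GetTotalMemory(false);

        // Force a full collection and wait for finalizers,
        // then collect again to reclaim finalized objects.
        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();

        long after = GC.GetTotalMemory(false);

        Console.WriteLine("Managed heap before: {0:N0} bytes", before);
        Console.WriteLine("Managed heap after:  {0:N0} bytes", after);
    }
}
```

If the "after" number settles back to roughly the same value each time you run this, the growth you see is just uncollected garbage, not a leak.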