So, I've got this awesome program that is very useful:
using System.Collections.Generic;

static void Main(string[] args)
{
    // Allocate a large dictionary and immediately drop the reference:
    // nothing keeps it alive after this line.
    new Dictionary<int, int>(10000000);

    // Keep the process alive so its memory usage can be observed.
    while (true)
    {
        System.Threading.Thread.Sleep(1000);
    }
}
This doesn't even produce any warnings from the compiler, which is surprising.
Running this allocates a large block of memory that is never released. If I start several copies, I eventually reach a point where I can't start any more because the system has run out of memory.
So what's going on here?
The garbage collector is non-deterministic, and responds to memory pressure. If nothing needs the memory, it may not collect for a while. It can't optimize away the new, as that would change your code: the constructor could have side-effects. Also, in a debug build it is even more likely to decide not to collect.
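To see why the new can't simply be elided, consider a hypothetical type (Noisy is my own illustration, not from the question) whose constructor has a visible side effect:

using System;

class Noisy
{
    // The constructor has an observable side effect, so the compiler/JIT
    // must still run it even though the result is never stored.
    public Noisy()
    {
        Console.WriteLine("constructed!");
    }
}

// Elsewhere:
new Noisy(); // removing this line would silently drop the console output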
In a release/optimized build, I would expect this to collect at some point, when there is a good reason to. There is also GC.Collect, but that should generally be avoided except in extreme scenarios or for certain profiling needs.
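For diagnostics only, here is a sketch of forcing a collection and observing the effect; GC.GetTotalMemory is used just to make the numbers visible, and code like this shouldn't ship:

using System;
using System.Collections.Generic;

static void Main()
{
    new Dictionary<int, int>(10000000);

    // Bytes currently believed allocated, without forcing a collection.
    Console.WriteLine(GC.GetTotalMemory(forceFullCollection: false));

    GC.Collect();                    // force a full collection (avoid in production)
    GC.WaitForPendingFinalizers();   // let any pending finalizers run
    GC.Collect();                    // collect anything those finalizers freed

    // Passing true forces a collection before measuring; expect a much smaller number.
    Console.WriteLine(GC.GetTotalMemory(forceFullCollection: true));
}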
As a "why" - there is a difference in the GC behaviour between GC "generations"; and you have some big arrays on the "large object heap" (LOH). This LOH is pretty expensive to keep checking, which may explain further why it is so reluctant.