
How do I get .NET to garbage collect aggressively?

I have an application used for image processing, and I typically allocate arrays of 4000x4000 ushorts, as well as the occasional float array and the like. Currently, the .NET framework tends to crash in this app apparently at random, almost always with an out-of-memory error. A 4000x4000 ushort array is 32 MB (4000 x 4000 x 2 bytes), which is not a huge allocation, but if .NET is fragmenting memory, then it's very possible that such large contiguous allocations aren't behaving as expected.

Is there a way to tell the garbage collector to be more aggressive, or to defragment memory (if that's the problem)? I realize there are the GC.Collect and GC.WaitForPendingFinalizers calls, and I've sprinkled them pretty liberally through my code, but I'm still getting the errors. It may be because I'm calling DLL routines that use native code a lot, but I'm not sure. I've gone over that C++ code and made sure that any memory I allocate I delete, but I still get these C# crashes, so I'm pretty sure it's not there. I wonder if the C++ calls could be interfering with the GC, making it leave memory behind because it once interacted with a native call. Is that possible? If so, can I turn that functionality off?
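For what it's worth, a single GC.Collect plus GC.WaitForPendingFinalizers can still leave behind memory that finalizers only released during that pass; the usual pattern for a full blocking collection is collect, wait for finalizers, then collect again. A minimal sketch (GcUtil and ForceFullCollection are illustrative names, not from the question):

using System;

static class GcUtil {
    // Collect, let finalizers run, then collect again so that memory
    // released by those finalizers is actually reclaimed in this pass.
    public static void ForceFullCollection() {
        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();
    }
}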

EDIT: Here is some very specific code that will cause the crash. According to this SO question, I do not need to dispose of the BitmapSource objects here. Here is the naive version, with no GC.Collect calls in it. It generally crashes on iteration 4 to 10 of the undo procedure. This code replaces the constructor in a blank WPF project, since I'm using WPF. I do the wackiness with the BitmapSource because of the limitations I explained in my answer to @dthorpe below, as well as the requirements listed in this SO question.

public partial class Window1 : Window {
    public Window1() {
        InitializeComponent();
        // Attempts to create an OOM crash:
        // mimic minute croppings of an 'image' (ushort array), then undo the crops.
        int theRows = 4000, currRows;
        int theColumns = 4000, currCols;
        int theMaxChange = 30;
        int i;
        List<ushort[]> theList = new List<ushort[]>(); // the list of images in the undo/redo stack
        byte[] displayBuffer = null; // the buffer used as a bitmap source
        BitmapSource theSource = null;
        for (i = 0; i < theMaxChange; i++) {
            currRows = theRows - i;
            currCols = theColumns - i;
            theList.Add(new ushort[(theRows - i) * (theColumns - i)]);
            displayBuffer = new byte[theList[i].Length];
            theSource = BitmapSource.Create(currCols, currRows,
                    96, 96, PixelFormats.Gray8, null, displayBuffer,
                    (currCols * PixelFormats.Gray8.BitsPerPixel + 7) / 8);
            System.Console.WriteLine("Got to change " + i.ToString());
            System.Threading.Thread.Sleep(100);
        }
        // Should get here. If not, then theMaxChange is too large.
        // Now, go back up the undo stack.
        for (i = theMaxChange - 1; i >= 0; i--) {
            displayBuffer = new byte[theList[i].Length];
            theSource = BitmapSource.Create((theColumns - i), (theRows - i),
                    96, 96, PixelFormats.Gray8, null, displayBuffer,
                    ((theColumns - i) * PixelFormats.Gray8.BitsPerPixel + 7) / 8);
            System.Console.WriteLine("Got to undo change " + i.ToString());
            System.Threading.Thread.Sleep(100);
        }
    }
}

Now, if I call the garbage collector explicitly, I have to wrap the entire thing in an outer loop to cause the OOM crash. For me, this tends to happen around x = 50 or so:

public partial class Window1 : Window {
    public Window1() {
        InitializeComponent();
        // Attempts to create an OOM crash:
        // mimic minute croppings of an 'image' (ushort array), then undo the crops.
        for (int x = 0; x < 1000; x++) {
            int theRows = 4000, currRows;
            int theColumns = 4000, currCols;
            int theMaxChange = 30;
            int i;
            List<ushort[]> theList = new List<ushort[]>(); // the list of images in the undo/redo stack
            byte[] displayBuffer = null; // the buffer used as a bitmap source
            BitmapSource theSource = null;
            for (i = 0; i < theMaxChange; i++) {
                currRows = theRows - i;
                currCols = theColumns - i;
                theList.Add(new ushort[(theRows - i) * (theColumns - i)]);
                displayBuffer = new byte[theList[i].Length];
                theSource = BitmapSource.Create(currCols, currRows,
                        96, 96, PixelFormats.Gray8, null, displayBuffer,
                        (currCols * PixelFormats.Gray8.BitsPerPixel + 7) / 8);
            }
            // Should get here. If not, then theMaxChange is too large.
            // Now, go back up the undo stack.
            for (i = theMaxChange - 1; i >= 0; i--) {
                displayBuffer = new byte[theList[i].Length];
                theSource = BitmapSource.Create((theColumns - i), (theRows - i),
                        96, 96, PixelFormats.Gray8, null, displayBuffer,
                        ((theColumns - i) * PixelFormats.Gray8.BitsPerPixel + 7) / 8);
                // Force a collection, because we're in scenario 2: lots of large random changes.
                GC.Collect();
                GC.WaitForPendingFinalizers();
            }
            System.Console.WriteLine("Got to changelist " + x.ToString());
            System.Threading.Thread.Sleep(100);
        }
    }
}

If I'm mishandling memory in either scenario, or if there's something I should spot with a profiler, let me know. It's a pretty simple routine.

Unfortunately, it looks like @Kevin's answer is right: this is a bug in .NET and how .NET handles objects larger than 85k. This situation strikes me as exceedingly strange; could PowerPoint, or any of the other Office suite applications, be rewritten in .NET with this kind of limitation? 85k does not seem like a whole lot of space, and I'd also think that any program that frequently uses so-called 'large' allocations would become unstable within a matter of days to weeks when using .NET.

EDIT: It looks like Kevin is right; this is a limitation of .NET's GC. For those who don't want to follow the entire thread: .NET has four GC heaps: gen0, gen1, gen2, and the LOH (Large Object Heap). Everything that's 85k or smaller goes on one of the first three heaps, and objects are promoted from gen0 to gen1 to gen2 as they survive collections. Objects larger than 85k are placed on the LOH. The LOH is never compacted, so allocations of the kind I'm doing will eventually cause an OOM error as objects get scattered about that memory space. We've found that moving to .NET 4.0 helps the problem somewhat, delaying the exception, but not preventing it. To be honest, this feels a bit like the 640k barrier: 85k ought to be enough for any user application (to paraphrase this video of a discussion of the GC in .NET). For the record, Java's GC does not exhibit this behavior.
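A quick way to see the threshold in action, if you want to check it yourself: LOH objects are logically part of gen2, so GC.GetGeneration reports 2 even for a freshly allocated large array. A small sketch, with sizes I picked to straddle the documented ~85,000-byte boundary:

using System;

class LohDemo {
    static void Main() {
        byte[] small = new byte[84000]; // under the ~85,000-byte threshold: small object heap
        byte[] large = new byte[90000]; // over the threshold: Large Object Heap
        // A brand-new LOH array reports generation 2 because the LOH is collected with gen2.
        Console.WriteLine(GC.GetGeneration(small)); // 0
        Console.WriteLine(GC.GetGeneration(large)); // 2
    }
}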

asked May 18 '10 by mmr


1 Answer

Here are some articles detailing problems with the Large Object Heap. It sounds like this might be what you're running into.

http://connect.microsoft.com/VisualStudio/feedback/details/521147/large-object-heap-fragmentation-causes-outofmemoryexception

Dangers of the large object heap:
http://www.simple-talk.com/dotnet/.net-framework/the-dangers-of-the-large-object-heap/

Here is a link on how to collect data on the Large Object Heap (LOH):
http://msdn.microsoft.com/en-us/magazine/cc534993.aspx

According to the following post, there seems to be no way to compact the LOH. I can't find anything newer that explicitly says how to do it, so it appears this hasn't changed as of the 2.0 runtime:
http://blogs.msdn.com/maoni/archive/2006/04/18/large-object-heap.aspx
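As an aside for readers on newer runtimes: .NET 4.5.1 later added an opt-in, one-shot LOH compaction via GCSettings. It did not exist when this question was asked, so it's no help on the 2.0-4.0 runtimes discussed here, but a sketch of how it's used:

using System;
using System.Runtime;

class LohCompaction {
    static void Main() {
        // .NET 4.5.1+ only: request that the next blocking gen2 collection
        // also compact the LOH. The setting resets itself after that collection.
        GCSettings.LargeObjectHeapCompactionMode =
            GCLargeObjectHeapCompactionMode.CompactOnce;
        GC.Collect();
    }
}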

The simple way of handling the issue is to make small objects if at all possible. Your other option is to create only a few large objects and reuse them over and over. Not an ideal situation, but it might be better than rewriting the object structure. Since you said the created objects (arrays) are of different sizes, that might be difficult, but it could keep the application from crashing; a sketch of the reuse idea follows.
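To make the reuse idea concrete, here is a minimal sketch. The ReusableImageBuffer name and API are mine, not from the question: the idea is to make one LOH allocation at the maximum size up front and hand it out for every crop, instead of allocating a fresh large array each time.

using System;

class ReusableImageBuffer {
    private readonly ushort[] pixels; // one LOH allocation, made once

    public ReusableImageBuffer(int maxRows, int maxCols) {
        pixels = new ushort[maxRows * maxCols];
    }

    // Reuse the same backing array for any image that fits.
    // Callers must only touch the first rows * cols elements.
    public ushort[] GetBuffer(int rows, int cols) {
        int needed = rows * cols;
        if (needed > pixels.Length)
            throw new ArgumentException("Image larger than the preallocated buffer.");
        Array.Clear(pixels, 0, needed); // zero the region being handed out
        return pixels;
    }
}

The trade-off is that the buffer is always as large as the worst case and callers share state, but the LOH sees exactly one allocation instead of thousands of differently sized ones, so there is nothing left to fragment.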

answered Sep 18 '22 by kemiller2002