 

Can Large Object Heap Fragmentation cause OutOfMemory in 64bit processes?


I'm preparing a presentation for my team on .NET GC and memory. Different sources discuss the potential impact of fragmentation on the Large Object Heap. Since it would be an interesting phenomenon to demonstrate, I'm trying to reproduce it in code.

Thomas Weller offered this code, which looks like it should cause an OOM when trying to allocate a larger object into the gaps freed on the LOH, but for some reason the OOM doesn't occur. Is the LOH being compacted automatically in .NET 4.6? Or is LOH fragmentation simply not an issue in 64-bit processes?

source: https://stackoverflow.com/a/30361185/3374994

using System;
using System.Collections.Generic;

class Program
{
    static IList<byte[]> small = new List<byte[]>();
    static IList<byte[]> big = new List<byte[]>();

    static void Main()
    {
        int totalMB = 0;
        try
        {
            Console.WriteLine("Allocating memory...");
            while (true)
            {
                // Interleave 10 MB arrays with small arrays sized right at the
                // 85,000-byte LOH limit; only the big ones will be freed later.
                big.Add(new byte[10 * 1024 * 1024]);
                small.Add(new byte[85000 - 3 * IntPtr.Size]);
                totalMB += 10;
                Console.WriteLine("{0} MB allocated", totalMB);
            }
        }
        catch (OutOfMemoryException)
        {
            Console.WriteLine("Memory is full now. Attach and debug if you like. Press Enter when done.");
            Console.WriteLine("For WinDbg, try `!address -summary` and `!dumpheap -stat`.");
            Console.ReadLine();

            // Free the big arrays; the surviving small ones keep the gaps apart.
            big.Clear();
            GC.Collect();
            Console.WriteLine("Lots of memory has been freed. Check again with the same commands.");
            Console.ReadLine();

            try
            {
                // A 20 MB array cannot fit into any single 10 MB gap, so this
                // should fail if the LOH is fragmented and not compacted.
                big.Add(new byte[20 * 1024 * 1024]);
            }
            catch (OutOfMemoryException)
            {
                Console.WriteLine("It was not possible to allocate 20 MB although {0} MB are free.", totalMB);
                Console.ReadLine();
            }
        }
    }
}
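
As an aside, if you want to check for fragmentation without attaching WinDbg, newer runtimes expose it directly. A minimal sketch, assuming .NET Core 3.0 or later (GC.GetGCMemoryInfo is not available on .NET Framework 4.6, so it cannot replace the WinDbg commands above):

using System;

class FragmentationProbe
{
    static void Main()
    {
        // Allocate and drop a large array so there is something to measure.
        var temp = new byte[10 * 1024 * 1024];
        temp = null;
        GC.Collect();

        // FragmentedBytes counts the free space sitting in gaps between
        // live objects across the managed heaps, including the LOH.
        GCMemoryInfo info = GC.GetGCMemoryInfo();
        Console.WriteLine("Heap size:        {0:N0} bytes", info.HeapSizeBytes);
        Console.WriteLine("Fragmented bytes: {0:N0} bytes", info.FragmentedBytes);
    }
}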
Nahum Timerman asked Jan 16 '18


People also ask

Why is a large object heap bad?

Large objects pose a special problem for the runtime: they can't be reliably moved by copying, because doing so would require twice as much memory during garbage collection. Additionally, moving multi-megabyte objects around would cause the garbage collector to take an unreasonably long time to complete.

How can we avoid heap fragmentation?

Use a static array of structs, where each struct has: a solid chunk of memory that can hold N images (the chunking helps control fragmentation; try an initial N of 5 or so) and a parallel array of bools indicating whether the corresponding image is in use. A sketch of this layout follows.
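
A minimal sketch of that layout, assuming fixed-size images; the names (ImageChunk, ImagePool) and sizes are illustrative, not from the original:

// Each chunk owns one solid block of memory for N images plus a parallel
// bool array marking which slots are in use, so buffers are reused
// instead of repeatedly allocated and freed.
struct ImageChunk
{
    public byte[] Memory;  // contiguous block holding N image slots
    public bool[] InUse;   // InUse[i] == true means slot i is taken
}

static class ImagePool
{
    const int ImagesPerChunk = 5;           // the initial N of 5 suggested above
    const int ImageSizeBytes = 512 * 1024;  // assumed fixed image size

    static readonly ImageChunk[] chunks = new ImageChunk[4]; // static array of structs

    static ImagePool()
    {
        for (int c = 0; c < chunks.Length; c++)
        {
            chunks[c].Memory = new byte[ImagesPerChunk * ImageSizeBytes];
            chunks[c].InUse = new bool[ImagesPerChunk];
        }
    }

    // Finds a free slot, marks it in use, and reports it via the out parameters.
    // Returns false when every slot in every chunk is taken.
    public static bool TryAcquire(out int chunk, out int slot)
    {
        for (chunk = 0; chunk < chunks.Length; chunk++)
            for (slot = 0; slot < ImagesPerChunk; slot++)
                if (!chunks[chunk].InUse[slot])
                {
                    chunks[chunk].InUse[slot] = true;
                    return true;
                }
        chunk = slot = -1;
        return false;
    }

    public static void Release(int chunk, int slot)
    {
        chunks[chunk].InUse[slot] = false;
    }
}

Because each chunk is one contiguous allocation that lives for the whole program, freeing and reusing individual image slots never punches holes into the managed heap.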

What is large object heap?

If an object is greater than or equal to 85,000 bytes in size, it's considered a large object. This number was determined by performance tuning. When an object allocation request is for 85,000 or more bytes, the runtime allocates it on the large object heap.
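
A small sketch that makes the threshold visible, assuming a 64-bit process; LOH allocations are reported by GC.GetGeneration as generation 2:

using System;

class LohThresholdDemo
{
    static void Main()
    {
        byte[] small = new byte[80000];  // well under 85,000 bytes -> small object heap
        byte[] large = new byte[85000];  // >= 85,000 bytes -> large object heap

        // LOH objects are only collected as part of generation 2, so
        // GC.GetGeneration reports 2 for them even right after allocation.
        Console.WriteLine("80,000-byte array: generation {0}", GC.GetGeneration(small)); // typically 0
        Console.WriteLine("85,000-byte array: generation {0}", GC.GetGeneration(large)); // 2
    }
}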

What is small object heap?

The Small Object Heap has generations that are checked from time to time. At the end of a collection this heap is fragmented, so it needs to be compacted. If large objects lived in this heap, defragmentation would take a long time.
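
One way to see that generational behavior is to watch the per-generation collection counters; a minimal sketch (the allocation sizes and counts are arbitrary):

using System;
using System.Collections.Generic;

class GenerationCounters
{
    static void Main()
    {
        var survivors = new List<byte[]>();

        // Churn through many short-lived small objects and keep only a few,
        // so generation 0 is collected often while generation 2 runs rarely.
        for (int i = 0; i < 1000000; i++)
        {
            var temp = new byte[1024];
            if (i % 10000 == 0) survivors.Add(temp);
        }

        Console.WriteLine("Gen 0 collections: {0}", GC.CollectionCount(0));
        Console.WriteLine("Gen 1 collections: {0}", GC.CollectionCount(1));
        Console.WriteLine("Gen 2 collections: {0}", GC.CollectionCount(2));
    }
}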


2 Answers

Since .NET 4.5.1 (and also in .NET Core), LOH compaction is supported; the behavior is controlled by the LargeObjectHeapCompactionMode property of the static GCSettings class.

This means the GC can compact the LOH, but only when you ask it to by setting that property to CompactOnce before a blocking full collection; the LOH is not compacted automatically by default.

Be aware that a 32-bit process is limited in how much memory it can use, so it is more likely to run into an OutOfMemoryException.
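
For reference, a minimal sketch of how that property is used (the API is as documented; the allocation size is just illustrative):

using System;
using System.Runtime;

class CompactLohOnce
{
    static void Main()
    {
        // Ask the next blocking generation 2 collection to also compact the
        // large object heap.
        GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
        GC.Collect();

        // After that collection, freed LOH gaps have been coalesced, so a
        // larger allocation (like the 20 MB array in the question) can fit.
        byte[] big = new byte[20 * 1024 * 1024];
        Console.WriteLine("Allocated {0} bytes after LOH compaction.", big.Length);
    }
}

The CompactOnce value is consumed by that collection and then reverts to Default, so it has to be set again before each compaction you want.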

CodeTherapist answered Oct 13 '22


My guess is that the LOH will not be compacted automatically.

Since compacting the LOH has a performance impact, we should only do it when we know for sure that it is needed.

With this code, it will run out of memory very soon; the collection only reclaims objects that are no longer referenced, it does not compact the gaps they leave behind.

Max CHien answered Oct 13 '22