 

Large Object Heap Fragmentation

The C#/.NET application I am working on is suffering from a slow memory leak. I have used CDB with SOS to try to determine what is happening but the data does not seem to make any sense so I was hoping one of you may have experienced this before.

The application runs on the 64-bit framework. It continuously calculates and serialises data to a remote host and hits the Large Object Heap (LOH) a fair bit. However, I expect most of the LOH objects to be transient: once a calculation is complete and the result has been sent to the remote host, the memory should be freed. What I am seeing instead is a large number of (live) object arrays interleaved with free blocks of memory. For example, taking a random segment from the LOH:

0:000> !DumpHeap 000000005b5b1000 000000006351da10
         Address               MT     Size
...
000000005d4f92e0 0000064280c7c970 16147872
000000005e45f880 00000000001661d0  1901752 Free
000000005e62fd38 00000642788d8ba8     1056       <--
000000005e630158 00000000001661d0  5988848 Free
000000005ebe6348 00000642788d8ba8     1056
000000005ebe6768 00000000001661d0  6481336 Free
000000005f214d20 00000642788d8ba8     1056
000000005f215140 00000000001661d0  7346016 Free
000000005f9168a0 00000642788d8ba8     1056
000000005f916cc0 00000000001661d0  7611648 Free
00000000600591c0 00000642788d8ba8     1056
00000000600595e0 00000000001661d0   264808 Free
...

Obviously I would expect this if my application were creating long-lived, large objects during each calculation. (It does do this, and I accept there will be a degree of LOH fragmentation, but that is not the problem here.) The problem is the very small (1056-byte) object arrays visible in the dump above, which I cannot see being created anywhere in our code and which remain rooted somehow.

Also note that CDB does not report the type when the heap segment is dumped; I am not sure whether this is related. If I dump the marked (<--) object, CDB/SOS reports it fine:

0:015> !DumpObj 000000005e62fd38
Name: System.Object[]
MethodTable: 00000642788d8ba8
EEClass: 00000642789d7660
Size: 1056(0x420) bytes
Array: Rank 1, Number of elements 128, Type CLASS
Element Type: System.Object
Fields:
None

The elements of the object array are all strings and the strings are recognisable as from our application code.

Also, I am unable to find their GC roots as the !GCRoot command hangs and never comes back (I have even tried leaving it overnight).

So, I would very much appreciate it if anyone could shed any light on why these small (<85k) object arrays are ending up on the LOH: in what situations will .NET put a small object array there? Also, does anyone know of an alternative way to ascertain the roots of these objects?


Update 1

Another theory I came up with late yesterday is that these object arrays started out large but have been shrunk, leaving the blocks of free memory that are evident in the memory dumps. What makes me suspicious is that the object arrays always appear to be 1056 bytes long (128 elements): 128 * 8 bytes for the references plus 32 bytes of overhead.

The idea is that perhaps some unsafe code in a library or in the CLR is corrupting the number of elements field in the array header. Bit of a long shot I know...


Update 2

Thanks to Brian Rasmussen (see accepted answer) the problem has been identified as fragmentation of the LOH caused by the string intern table! I wrote a quick test application to confirm this:

using System;

class InternTest
{
    static void Main()
    {
        const int ITERATIONS = 100000;

        for (int index = 0; index < ITERATIONS; ++index)
        {
            string str = "NonInterned" + index;
            Console.Out.WriteLine(str);
        }

        Console.Out.WriteLine("Continue.");
        Console.In.ReadLine();

        for (int index = 0; index < ITERATIONS; ++index)
        {
            string str = string.Intern("Interned" + index);
            Console.Out.WriteLine(str);
        }

        Console.Out.WriteLine("Continue?");
        Console.In.ReadLine();
    }
}

The application first creates unique strings in a loop and drops the references to them. This is just to prove that the memory does not leak in this scenario. Obviously it should not, and it does not.

In the second loop, unique strings are created and interned. This action roots them in the intern table. What I did not realise is how the intern table is represented: it appears to consist of a set of pages, object arrays of 128 string elements, that are created on the LOH. This is more evident in CDB/SOS:

0:000> .loadby sos mscorwks
0:000> !EEHeap -gc
Number of GC Heaps: 1
generation 0 starts at 0x00f7a9b0
generation 1 starts at 0x00e79c3c
generation 2 starts at 0x00b21000
ephemeral segment allocation context: none
 segment    begin allocated     size
00b20000 00b21000  010029bc 0x004e19bc(5118396)
Large object heap starts at 0x01b21000
 segment    begin allocated     size
01b20000 01b21000  01b8ade0 0x00069de0(433632)
Total Size  0x54b79c(5552028)
------------------------------
GC Heap Size  0x54b79c(5552028)

Taking a dump of the LOH segment reveals the pattern I saw in the leaking application:

0:000> !DumpHeap 01b21000 01b8ade0
...
01b8a120 793040bc      528
01b8a330 00175e88       16 Free
01b8a340 793040bc      528
01b8a550 00175e88       16 Free
01b8a560 793040bc      528
01b8a770 00175e88       16 Free
01b8a780 793040bc      528
01b8a990 00175e88       16 Free
01b8a9a0 793040bc      528
01b8abb0 00175e88       16 Free
01b8abc0 793040bc      528
01b8add0 00175e88       16 Free
total 1568 objects
Statistics:
      MT    Count    TotalSize Class Name
00175e88      784        12544      Free
793040bc      784       421088 System.Object[]
Total 1568 objects

Note that the object array size is 528 bytes (rather than 1056) because my workstation is 32-bit and the application server is 64-bit: 128 * 4 bytes per reference plus 16 bytes of overhead. The object arrays are still 128 elements long.

So the moral of this story is: be very careful when interning strings. If the strings you are interning are not known to come from a finite set, your application will leak due to fragmentation of the LOH, at least in version 2 of the CLR.

In our application's case, there is general code in the deserialisation code path that interns entity identifiers during unmarshalling: I now strongly suspect this is the culprit. The developer's intentions were good, though: they wanted to make sure that if the same entity is deserialised multiple times, only one instance of the identifier string is kept in memory. A safer alternative is sketched below.
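For reference, here is a minimal sketch of the kind of application-owned intern pool that could be used instead. The InternPool class is hypothetical (not part of the framework); the point is that because the application owns the dictionary, the pooled strings live on the ordinary GC heaps and all become collectible as soon as the pool itself is dropped:

using System.Collections.Generic;

// Hypothetical replacement for string.Intern: deduplicates strings like the
// CLR intern table does, but the pool is an ordinary managed object, so the
// strings are reclaimable once the pool is no longer referenced.
public sealed class InternPool
{
    private readonly Dictionary<string, string> _pool =
        new Dictionary<string, string>();

    public string Intern(string value)
    {
        string existing;
        if (_pool.TryGetValue(value, out existing))
        {
            return existing; // reuse the single stored instance
        }
        _pool.Add(value, value);
        return value;
    }
}

This keeps the deduplication the original developer wanted while avoiding the permanently rooted LOH pages of the CLR intern table.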

asked Mar 26 '09 by Paul Ruane


People also ask

What is the large object heap?

If an object is greater than or equal to 85,000 bytes in size, it's considered a large object. This number was determined by performance tuning. When an object allocation request is for 85,000 or more bytes, the runtime allocates it on the large object heap.
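The threshold is easy to observe with a small sketch. This relies on the fact that the LOH is reported as generation 2 by GC.GetGeneration; the exact cutoff for a given array shifts slightly because of object header overhead:

using System;

class LohThresholdDemo
{
    static void Main()
    {
        byte[] small = new byte[84000]; // under the threshold: small object heap
        byte[] large = new byte[85000]; // at/over the threshold: LOH

        Console.WriteLine(GC.GetGeneration(small)); // typically 0 (freshly allocated)
        Console.WriteLine(GC.GetGeneration(large)); // 2: the LOH is part of generation 2
    }
}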

Why is a large object heap bad?

Large objects pose a special problem for the runtime: they can't be reliably moved by copying, as doing so would require twice as much memory during garbage collection. Additionally, moving multi-megabyte objects around would cause the garbage collector to take an unreasonably long time to complete.

What is fragmentation in garbage collection?

Fragmentation is a state of the heap where free memory is available but not in a large enough contiguous block to host a new object about to be allocated.
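A sketch of how this plays out on the LOH specifically (the sizes are illustrative): interleave short-lived and long-lived large allocations, collect, and the free space ends up as many holes that a larger request cannot use:

using System;
using System.Collections.Generic;

class FragmentationDemo
{
    static void Main()
    {
        var survivors = new List<byte[]>();
        for (int i = 0; i < 100; i++)
        {
            byte[] garbage = new byte[1024 * 1024]; // dies, leaving a ~1 MB hole
            survivors.Add(new byte[1024 * 1024]);   // stays live after each hole
        }
        GC.Collect(); // the LOH is swept, not compacted, so the holes remain

        // ~100 MB is free in total, but no single hole is 2 MB, so this
        // allocation forces the heap to grow (or fails in a memory-constrained
        // 32-bit process).
        byte[] big = new byte[2 * 1024 * 1024];
        Console.WriteLine("{0} survivors; big array allocated.", survivors.Count);
    }
}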

How does .NET GC work?

.NET's garbage collector manages the allocation and release of memory for your application. Each time you create a new object, the common language runtime allocates memory for it from the managed heap.


2 Answers

The CLR uses the LOH to preallocate a few objects (such as the arrays used for interned strings). Some of these are less than 85,000 bytes and thus would not normally be allocated on the LOH.

It is an implementation detail, but I assume the reason is to avoid unnecessary garbage collection of instances that are supposed to survive as long as the process itself.

Also, due to a somewhat esoteric optimization, any double[] of 1000 or more elements is allocated on the LOH; see the sketch below.
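This is easy to verify on the 32-bit CLR with a sketch like the following (on 64-bit, doubles are already 8-byte aligned, so the special rule reportedly does not apply and the normal 85,000-byte threshold governs):

using System;

class DoubleArrayDemo
{
    static void Main()
    {
        double[] under = new double[999];  // stays on the small object heap
        double[] over = new double[1000];  // ~8 KB, yet allocated on the LOH

        Console.WriteLine(GC.GetGeneration(under)); // typically 0
        Console.WriteLine(GC.GetGeneration(over));  // 2 on the 32-bit CLR
    }
}

The rationale is alignment: placing large double arrays on the LOH guarantees 8-byte alignment, which speeds up access on 32-bit hardware.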

answered Oct 04 '22 by Brian Rasmussen (accepted answer)


Starting with .NET Framework 4.5.1, you can explicitly compact the large object heap (LOH) during garbage collection:

GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
GC.Collect();

See GCSettings.LargeObjectHeapCompactionMode for more information. Note that after the compacting collection occurs, the setting reverts to GCLargeObjectHeapCompactionMode.Default, so compaction must be requested explicitly before each collection that should compact.

answered Oct 04 '22 by Andre Abrantes