
.NET garbage collector and x64 virtual memory

We are running a .NET application on Windows Server 2008 x64 with 16GB of RAM. The application needs to fetch and analyze a very large amount of data (about 64GB) and keep it all in memory at one time.

What I expect to see: Process size expands past 16GB to 64GB. Windows uses virtual memory to page the extra data to/from disk as needed. This is the classic virtual memory use case.

What I actually see: Process size is limited to the amount of physical memory (16GB). Application spends 99.8% of its time in the garbage collector.

Why is our application failing to use virtual memory? Is this a problem in the configuration of the .NET garbage collector, or in the Windows x64 virtual memory manager itself? What can I do to get our application to use virtual memory rather than be limited to physical memory?

Thanks.

-- Brian

Update: I have written a very small program that exhibits the same behavior:

using System;

namespace GCTest
{
    class Program
    {
        static void Main()
        {
            // Allocate 100 million small arrays (~32GB of payload plus
            // per-object overhead), all kept reachable via the outer array.
            byte[][] arrays = new byte[100000000][];
            for (int i = 0; i < arrays.Length; ++i)
            {
                arrays[i] = new byte[320];
                if (i % 100000 == 0)
                {
                    Console.WriteLine("{0} arrays allocated", i);
                    System.Threading.Thread.Sleep(100);
                }
            }
        }
    }
}

If you want to try it, make sure to build for x64. You may have to modify the constants a bit to stress your system. The behavior I see is that the process bogs down as it approaches a size of 16GB. There is no error message or exception thrown. Performance monitor reports that the % of CPU time in GC approaches 100%.
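
For what it's worth, the same "% Time in GC" figure can be read programmatically from the ".NET CLR Memory" performance counter category. A minimal monitoring sketch; the instance name "GCTest" is an assumption based on the sample's process name:

using System;
using System.Diagnostics;

class GcTimeMonitor
{
    static void Main()
    {
        // ".NET CLR Memory" is the CLR's built-in counter category;
        // the instance name is the name of the process to watch
        // ("GCTest" here is assumed from the sample above).
        var gcTime = new PerformanceCounter(
            ".NET CLR Memory", "% Time in GC", "GCTest");

        while (true)
        {
            // The first NextValue() call returns 0; later calls report the
            // percentage of time spent in GC since the previous sample.
            Console.WriteLine("% Time in GC: {0:F1}", gcTime.NextValue());
            System.Threading.Thread.Sleep(1000);
        }
    }
}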

Isn't this unacceptable? Where's the virtual memory system?


1 Answer

Have you checked to make sure that your paging file is configured so that it can expand to that size?
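
One quick way to check from managed code (a minimal sketch; requires a reference to System.Management): WMI's Win32_PageFileUsage class reports each page file's allocated size and current usage, and Win32_PageFileSetting exposes the configured initial and maximum sizes.

using System;
using System.Management;

class PageFileCheck
{
    static void Main()
    {
        // Win32_PageFileUsage reports sizes in megabytes.
        var searcher = new ManagementObjectSearcher(
            "SELECT Name, AllocatedBaseSize, CurrentUsage FROM Win32_PageFileUsage");

        foreach (ManagementObject pf in searcher.Get())
        {
            Console.WriteLine("{0}: allocated {1} MB, in use {2} MB",
                pf["Name"], pf["AllocatedBaseSize"], pf["CurrentUsage"]);
        }
    }
}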

Update

I've been playing around with your example quite a bit, and here's what I see.

System: Windows 7 64-bit, 6GB of triple-channel RAM, 8 cores.

  1. You need an additional paging file on a different spindle from your OS drive, or this sort of investigation will hose your machine. If everything is fighting over the same paging file, it only makes things worse.

  2. I am seeing a large amount of data being promoted from generation to generation in the GC, plus a large number of GC sweeps\collections, and as a result a massive number of page faults as physical memory limits are reached. I can only assume that when physical memory is exhausted\very high, this triggers generation sweeps and promotions, causing a large amount of paged-out memory to be touched, which leads to a death spiral as touched memory is paged in and other memory is forced out. The whole thing ends in a soggy mess. This seems inevitable when allocating a large number of long-lived objects which end up in the Small Object Heap (a simple way to watch the collection counts is sketched after this list).
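
A minimal sketch of how to watch this happen: GC.CollectionCount and GC.GetTotalMemory are standard APIs, so a background thread can log per-generation collection counts and heap size while the allocation loop runs (the one-second logging interval is arbitrary):

using System;
using System.Threading;

class GcStatsLogger
{
    // Call once at startup; logs GC activity every second from a background thread.
    public static void Start()
    {
        var logger = new Thread(() =>
        {
            while (true)
            {
                Console.WriteLine("gen0={0} gen1={1} gen2={2} heap={3:N0} bytes",
                    GC.CollectionCount(0),
                    GC.CollectionCount(1),
                    GC.CollectionCount(2),
                    GC.GetTotalMemory(false)); // false: don't force a collection
                Thread.Sleep(1000);
            }
        });
        logger.IsBackground = true; // don't keep the process alive on exit
        logger.Start();
    }
}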

Now compare this to allocating objects in a fashion that will allocate them directly onto the Large Object Heap (which does not suffer the same sweeping and promotion issues):

using System;
using System.Collections.Generic;

class Program
{
    private static void Main()
    {
        const int MaxNodeCount = 100000000;
        // Objects of 85,000 bytes or more go straight to the Large Object Heap.
        const int LargeObjectSize = 85 * 1000;

        // Keep every array reachable so nothing can be collected.
        LinkedList<byte[]> list = new LinkedList<byte[]>();

        for (long i = 0; i < MaxNodeCount; ++i)
        {
            list.AddLast(new byte[LargeObjectSize]);

            if (i % 100000 == 0)
            {
                Console.WriteLine("{0:N0} 'approx' extra bytes allocated.",
                   (i + 1) * LargeObjectSize);
            }
        }
    }
}

This works as expected, i.e. virtual memory is used and then eventually exhausted - 54GB in my environment\configuration.

So it appears that allocating a mass of long-lived small objects will eventually lead to a vicious cycle in the GC as generation sweeps and promotions are made when physical memory has been exhausted - it's a page-file death spiral.
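
A quick way to confirm which heap an allocation lands on (a minimal sketch; on the desktop CLR the LOH is collected as part of generation 2, so GC.GetGeneration reports 2 for large objects from the moment they are allocated):

using System;

class Program
{
    static void Main()
    {
        byte[] small = new byte[320];       // lands on the Small Object Heap
        byte[] large = new byte[85 * 1000]; // at the ~85,000-byte LOH threshold

        Console.WriteLine(GC.GetGeneration(small)); // 0: fresh SOH objects start in gen 0
        Console.WriteLine(GC.GetGeneration(large)); // 2: the LOH is collected with gen 2
    }
}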

Update 2

Whilst investigating the issue I played with a number of options\configurations which made no appreciable difference (how the first two are applied is sketched after this list):

  • Forcing Server GC mode.
  • Configuring low latency GC.
  • Various combinations of forced GC collections to try to amortize the cost.
  • Setting min\max process working sets.
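
For reference, a minimal sketch of how those first two settings are applied; this is standard .NET configuration, not something specific to this repro. Server GC is enabled in the application's .config file:

<configuration>
  <runtime>
    <!-- Use the server GC flavor: one GC heap and GC thread per core. -->
    <gcServer enabled="true"/>
  </runtime>
</configuration>

and the low-latency mode is set in code via GCSettings:

using System;
using System.Runtime;

class Program
{
    static void Main()
    {
        // LowLatency suppresses blocking gen-2 collections while set; it is
        // meant for short critical regions, not whole-program use.
        GCSettings.LatencyMode = GCLatencyMode.LowLatency;
        Console.WriteLine(GCSettings.LatencyMode);
    }
}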