On my laptop, running 64-bit Windows 7 with 2 GB of free memory (as reported by Task Manager), I'm able to do:
var x = new Dictionary<Guid, decimal>( 30 * 1024 * 1024 );
Without a computer with more RAM at hand, I'm wondering whether this will scale, so that on a computer with 4 GB of free memory I'll be able to allocate 60M items instead of "just" 30M, and so on?
Or are there other limitations (of .NET and/or Windows) that I'll bump into before I'm able to consume all available RAM?
Update: OK, so I'm not allowed to allocate a single object larger than 2 GB. That's important to know! But then I'm of course curious whether I'll be able to fully utilize all memory by allocating 2 GB chunks like this:
var x = new List<Dictionary<Guid, decimal>>();
for ( var i = 0 ; i < 10 ; i++ )
    x.Add( new Dictionary<Guid, decimal>( 30 * 1024 * 1024 ) );
Would this work if the computer has more than 20 GB of free memory?
The theoretical memory limit that a 64-bit computer can address is about 16 exabytes (16 billion gigabytes). Windows XP x64 is currently limited to 128 GB of physical memory and 8 TB of virtual memory per process, and these limits will be raised in the future as hardware capabilities improve.
Maybe you don't know it, but all versions of .NET so far (1.0, 2.0, 3.0, 3.5 and 4.0) have a limit on the maximum size a single object can have: 2 GB. No matter whether you are running in a 64-bit or a 32-bit process, you cannot create any single object bigger than that.
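Here's a minimal sketch you can run to see the per-object limit for yourself (assuming a 64-bit process with default runtime settings; the class name is just for illustration):

using System;

class SingleObjectLimitDemo
{
    static void Main()
    {
        try
        {
            // Just under int.MaxValue bytes of payload, but together with the
            // object header this exceeds the per-object limit and throws,
            // no matter how much physical RAM is free.
            var huge = new byte[int.MaxValue];
            Console.WriteLine( "Allocated {0:N0} bytes", huge.LongLength );
        }
        catch ( OutOfMemoryException )
        {
            Console.WriteLine( "OutOfMemoryException: a single object cannot exceed 2 GB" );
        }
    }
}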
However, a common question is whether a 64-bit process actually uses more memory than its 32-bit counterpart. The answer is yes, but generally only by a small amount, maybe 100 MB of extra memory, largely because references and pointers double in size. (As a point of comparison from outside .NET, the 64-bit version of R under 64-bit Windows currently has a limit of 8 TB.)
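If you want to check which case your own process is in, a quick sketch (these properties exist from .NET 4.0 on):

using System;

class BitnessCheck
{
    static void Main()
    {
        // References double in size in a 64-bit process, which is the main
        // reason for the slightly larger memory footprint.
        Console.WriteLine( "64-bit OS:      " + Environment.Is64BitOperatingSystem );
        Console.WriteLine( "64-bit process: " + Environment.Is64BitProcess );
        Console.WriteLine( "Reference size: " + IntPtr.Size + " bytes" );
    }
}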
There's a 2 GiB limit on any single object in .NET; you are never allowed to create one object that exceeds 2 GiB. If you need something bigger, you need to build it from parts that are each smaller than 2 GiB, so you cannot have a contiguous array larger than 2 GiB, nor, it seems, a single string longer than about 512 MiB. I'm not entirely sure about the exact string limit, but I've done some testing on the issue and got OutOfMemoryExceptions when I tried to allocate strings bigger than 512 MiB.
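One hedged sketch of the "build it from smaller parts" idea: a hypothetical ChunkedBuffer (not a library type, just an illustration) that can hold far more than 2 GiB in total while every individual array stays well below the limit:

using System;
using System.Collections.Generic;

// Hypothetical example: stores arbitrarily much data as a list of fixed-size
// chunks so that no single object approaches the 2 GiB per-object limit.
class ChunkedBuffer
{
    private const int ChunkSize = 64 * 1024 * 1024;   // 64 MiB per chunk
    private readonly List<byte[]> _chunks = new List<byte[]>();
    private long _length;

    public long Length { get { return _length; } }

    public void Append( byte value )
    {
        int offset = (int)( _length % ChunkSize );
        if ( offset == 0 )
            _chunks.Add( new byte[ChunkSize] );        // grow one chunk at a time
        _chunks[_chunks.Count - 1][offset] = value;
        _length++;
    }

    public byte this[long index]
    {
        get { return _chunks[(int)( index / ChunkSize )][(int)( index % ChunkSize )]; }
        set { _chunks[(int)( index / ChunkSize )][(int)( index % ChunkSize )] = value; }
    }
}

Total capacity of such a structure is then limited only by available address space and RAM, not by the per-object limit.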
These limits are also subject to heap fragmentation, and even though the GC tries to compact the heap, large objects (the threshold is around 85,000 bytes) end up on the large object heap, which is a heap that isn't compacted. Strictly speaking, and somewhat as a side note, if you can keep short-lived allocations below this threshold, it is better for your overall GC memory management and performance.
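A small sketch that makes the threshold visible (the exact generation numbers are runtime-dependent, so treat this as illustrative):

using System;

class LohThresholdDemo
{
    static void Main()
    {
        var small = new byte[80 * 1000];   // 80,000 bytes: below the LOH threshold
        var large = new byte[85 * 1000];   // 85,000 bytes: at or above the threshold

        // The LOH is only collected together with generation 2, so freshly
        // allocated large objects are reported as gen 2 while small ones start in gen 0.
        Console.WriteLine( "small array: gen " + GC.GetGeneration( small ) );
        Console.WriteLine( "large array: gen " + GC.GetGeneration( large ) );
    }
}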