What is the maximum memory the garbage collector can allocate for a .NET process? When I compile to x86, Process.GetCurrentProcess().MaxWorkingSet returns about 1.4 GB, but when I compile to AnyCPU (x64) the same number is returned. For x64 it should be more like the "Limit" value that is displayed in the Task Manager. How can I get, in all cases, the correct number that will cause an OutOfMemoryException when exceeded?
Some examples of what the method should return:
1) Machine configuration: x64 Windows, 4 GB physical memory, 4 GB page file
- As 64-bit process: 8 GB
- As 32-bit process: 1.4 GB
2) Machine configuration: x64 Windows, 1 GB physical memory, 2 GB page file
- As 64-bit process: 3 GB
- As 32-bit process: 1.4 GB
3) Machine configuration: x86 Windows, 4 GB physical memory, 4 GB page file
- As 64-bit process: won't happen
- As 32-bit process: 1.4 GB
4) Machine configuration: x86 Windows, 512 MB physical memory, 512 MB page file
- As 64-bit process: won't happen
- As 32-bit process: 1.0 GB
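For what it's worth, here is a minimal sketch of one way to approximate that number (my assumption of what is wanted, not a guaranteed figure): query the Win32 GlobalMemoryStatusEx function and take the smaller of the process's user-mode address space (ullTotalVirtual) and the system commit limit (ullTotalPageFile). For the fragmentation reasons discussed below, the real failure point will be lower.

    using System;
    using System.Runtime.InteropServices;

    static class MemoryLimit
    {
        [StructLayout(LayoutKind.Sequential)]
        private struct MEMORYSTATUSEX
        {
            public uint dwLength;
            public uint dwMemoryLoad;
            public ulong ullTotalPhys;
            public ulong ullAvailPhys;
            public ulong ullTotalPageFile;   // system commit limit (RAM + page file)
            public ulong ullAvailPageFile;
            public ulong ullTotalVirtual;    // user-mode address space of this process
            public ulong ullAvailVirtual;
            public ulong ullAvailExtendedVirtual;
        }

        [DllImport("kernel32.dll", SetLastError = true)]
        private static extern bool GlobalMemoryStatusEx(ref MEMORYSTATUSEX lpBuffer);

        // Rough upper bound on what this process could ever commit:
        // the smaller of its address space and the system commit limit.
        public static ulong ApproximateMaxBytes()
        {
            var status = new MEMORYSTATUSEX();
            status.dwLength = (uint)Marshal.SizeOf(typeof(MEMORYSTATUSEX));
            if (!GlobalMemoryStatusEx(ref status))
                throw new System.ComponentModel.Win32Exception(Marshal.GetLastWin32Error());
            return Math.Min(status.ullTotalVirtual, status.ullTotalPageFile);
        }
    }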
The 2 GB limit is not really a physical memory barrier: it is the default user-mode virtual address space of a process running on a 32-bit operating system, which can therefore use at most 2 GB of memory. The problem mainly affects 32-bit versions of operating systems like Microsoft Windows and Linux, although some variants of the latter can overcome this barrier.
Since .NET 1.0 the size limit for a single .NET object has been 2 GB. This means you cannot, for example, create an array whose elements total more than 2 GB.
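To illustrate that per-object limit with a small sketch (the exact threshold varies slightly by element type and runtime version):

    using System;

    class ObjectSizeLimitDemo
    {
        static void Main()
        {
            try
            {
                // 300 million doubles = 2.4 GB in a single object, which is
                // over the 2 GB per-object limit, so this throws even on x64
                // unless <gcAllowVeryLargeObjects> is enabled (.NET 4.5+).
                double[] huge = new double[300000000];
                Console.WriteLine("Allocated {0} elements", huge.Length);
            }
            catch (OutOfMemoryException)
            {
                Console.WriteLine("A single object larger than 2 GB was rejected.");
            }
        }
    }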
In principle, a 64-bit microprocessor can address 16 EiB (16 × 1024⁶ = 2⁶⁴ = 18,446,744,073,709,551,616 bytes, or about 18.4 exabytes) of memory.
In practice the limit is much lower. Dynamically allocated memory is memory requested while the program is running; in C++ this allocation is usually performed by the malloc function or the new operator. In 32-bit programs the size of dynamically allocated memory is restricted to about 2 GB; in 64-bit Windows programs the user-mode address space allows 8 TB (128 TB on Windows 8.1 and later).
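Which of those limits applies depends on the bitness of the running process, not only of the OS. A minimal check with standard .NET APIs (Environment.Is64BitProcess requires .NET 4.0 or later; on older runtimes IntPtr.Size alone will do):

    using System;

    class BitnessCheck
    {
        static void Main()
        {
            // IntPtr is 4 bytes in a 32-bit process and 8 bytes in a 64-bit one.
            Console.WriteLine("64-bit OS:      {0}", Environment.Is64BitOperatingSystem);
            Console.WriteLine("64-bit process: {0}", Environment.Is64BitProcess);
            Console.WriteLine("Pointer size:   {0} bytes", IntPtr.Size);
        }
    }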
Several other factors complicate any hard limit:
- Windows can be configured to allocate more page file space on demand, or on request, so the commit limit itself can move.
- Job objects can prevent the consumption of more than a certain amount of memory.
- Fragmentation of the heap and its generational nature (plus the need to put large objects in the Large Object Heap) keep the usable amount well below the theoretical one.
All of these mean that the hard limit is of little use in reality, and that answering the question "how much memory could I theoretically allocate?" is rather more complex than you might think.
Since it is complex, anyone asking that question is probably trying to do something the wrong way and should redirect the question to something more useful.
What are you trying to do that would appear to necessitate such a question?
"I just want to know when the current memory load of the process could get problematic so I can take actions like freeing some items of a custom cache."
Right, this is a much more tractable question.
Two solutions in order of complexity:
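The two solutions themselves are missing from this excerpt; as an educated guess at the kind of thing meant (not the original author's wording), the sketch below shows two common techniques: polling the process's memory use against a self-chosen threshold, and probing with System.Runtime.MemoryFailPoint before a large allocation. The threshold value is a placeholder.

    using System;
    using System.Diagnostics;
    using System.Runtime;

    static class CachePressure
    {
        // Hypothetical threshold: start evicting cache entries once the
        // process has committed more than this many bytes.
        private const long SoftLimitBytes = 1L * 1024 * 1024 * 1024;

        // Solution 1 (simple): poll current usage and compare to the threshold.
        public static bool ShouldTrimCache()
        {
            using (var p = Process.GetCurrentProcess())
                return p.PrivateMemorySize64 > SoftLimitBytes;
        }

        // Solution 2 (more precise): ask the CLR whether an allocation of
        // this size is likely to succeed before actually attempting it.
        public static bool TryReserve(int megabytes, Action work)
        {
            try
            {
                using (new MemoryFailPoint(megabytes))
                    work();
                return true;
            }
            catch (InsufficientMemoryException)
            {
                return false; // free some cache items and retry
            }
        }
    }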
Points to note: is it really less expensive to maintain this massive cache (which, by the sound of it, will be paged to disk) than to recalculate or re-request the data?
If your cache exhibits poor locality between commonly or consecutively requested items, much effort will be spent paging data in and out. A smaller cache with an effective, well-tuned replacement policy stands a good chance of performing considerably better (and with much less impact on other running programs); see the sketch below.
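As a concrete example of such a policy, here is a minimal least-recently-used cache sketch (capacity and key/value types are placeholders, not from the original answer):

    using System.Collections.Generic;

    // Minimal LRU cache: evicts the least-recently-used entry once
    // the entry count exceeds a fixed capacity.
    class LruCache<TKey, TValue>
    {
        private readonly int _capacity;
        private readonly Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>> _map;
        private readonly LinkedList<KeyValuePair<TKey, TValue>> _order; // front = most recent

        public LruCache(int capacity)
        {
            _capacity = capacity;
            _map = new Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>>(capacity);
            _order = new LinkedList<KeyValuePair<TKey, TValue>>();
        }

        public bool TryGet(TKey key, out TValue value)
        {
            LinkedListNode<KeyValuePair<TKey, TValue>> node;
            if (_map.TryGetValue(key, out node))
            {
                _order.Remove(node);       // mark as most recently used
                _order.AddFirst(node);
                value = node.Value.Value;
                return true;
            }
            value = default(TValue);
            return false;
        }

        public void Put(TKey key, TValue value)
        {
            LinkedListNode<KeyValuePair<TKey, TValue>> existing;
            if (_map.TryGetValue(key, out existing))
                _order.Remove(existing);
            else if (_map.Count >= _capacity)
            {
                var lru = _order.Last;     // evict the least recently used entry
                _order.RemoveLast();
                _map.Remove(lru.Value.Key);
            }
            var node = new LinkedListNode<KeyValuePair<TKey, TValue>>(
                new KeyValuePair<TKey, TValue>(key, value));
            _order.AddFirst(node);
            _map[key] = node;
        }
    }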
As an aside: in .NET, no variable-sized object (string, array) can exceed 2 GB due to limitations of the core CLR structures for memory management (and either solution above will benefit from this).