I am getting an Out Of Memory exception in my C# application when the memory usage for the application goes over about 1.3GB.
I had this same problem on a 32-bit machine with 3GB of memory, and it made sense back then. But I have since upgraded the hardware to a 64-bit machine with 16GB of memory, using a high-end motherboard and high-end RAM, yet the Out Of Memory exception still occurs after 1.3GB!
I know that no single object is over 2GB, and 1.3GB is less than 2GB anyway, so the built-in MS 2GB limit on a single object is not likely to be the problem.
It seems like there is a Windows kill-switch of some sort that triggers when an app reaches a certain memory usage threshold. If so, there should be a way to configure it. Is it in the registry, perhaps?
Any help will be greatly appreciated!
Well, going by the title of the question, the best way to avoid an out-of-memory exception is not to create objects that fill up that memory in the first place. You can then cap the length of your queue based on an estimate of how much memory a single object occupies. Another way would be to check the memory usage in each worker thread; both ideas are sketched below.
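For example (a minimal sketch; BlockingCollection comes from System.Collections.Concurrent in .NET 4.0+, and the 100MB budget and 1KB-per-item figures are made-up numbers for illustration):

    using System;
    using System.Collections.Concurrent;

    class BoundedQueueSketch
    {
        const long MemoryBudgetBytes = 100L * 1024 * 1024; // assumed 100MB budget
        const long EstimatedItemBytes = 1024;              // assumed ~1KB per item

        static void Main()
        {
            // Cap the queue length so the queued items cannot exceed the budget.
            int capacity = (int)(MemoryBudgetBytes / EstimatedItemBytes);
            var queue = new BlockingCollection<byte[]>(capacity);

            // Producers block on Add() once the queue is full, instead of
            // allocating without bound until an OutOfMemoryException hits.
            queue.Add(new byte[EstimatedItemBytes]);

            // Alternatively, a worker can check the managed heap size directly.
            long usedBytes = GC.GetTotalMemory(false);
            Console.WriteLine("Managed heap in use: {0:N0} bytes", usedBytes);
        }
    }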
Maybe you don't know it, but all versions of .NET up to and including the latest (1.0, 2.0, 3.0, 3.5 and 4.0) have a limit on the maximum size a single object can have: 2GB. No matter whether you are running in a 64-bit or a 32-bit process, you cannot create any single object bigger than that.
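You can see the limit in action with a minimal sketch like the one below (assuming default runtime settings; note that .NET 4.5 later added the gcAllowVeryLargeObjects configuration element, which lifts the array size limit in 64-bit processes):

    using System;

    class SingleObjectLimit
    {
        static void Main()
        {
            try
            {
                // 300 million longs = 2.4GB in one array: over the
                // 2GB single-object limit, even in a 64-bit process.
                long[] tooBig = new long[300000000];
                Console.WriteLine("Allocated {0:N0} elements", tooBig.Length);
            }
            catch (OutOfMemoryException)
            {
                Console.WriteLine("OutOfMemoryException: single object would exceed 2GB");
            }
        }
    }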
Upgrading the hardware makes no difference if you compile for the same target architecture. I suspect you are compiling for 32-bit on both machines; you can verify this from inside the process, as shown below.
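A minimal check (assuming .NET 4.0 or later for Environment.Is64BitProcess; on older versions, IntPtr.Size alone tells the story):

    using System;

    class BitnessCheck
    {
        static void Main()
        {
            // IntPtr.Size is 4 in a 32-bit process and 8 in a 64-bit process.
            Console.WriteLine("Pointer size:   {0} bytes", IntPtr.Size);

            // Available from .NET 4.0 onwards.
            Console.WriteLine("64-bit process: {0}", Environment.Is64BitProcess);
            Console.WriteLine("64-bit OS:      {0}", Environment.Is64BitOperatingSystem);
        }
    }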
It's worth mentioning that an OutOfMemoryException can also be raised if a single collection in the CLR (say a List<T>, which keeps its elements in one backing array) grows to 2GB of allocated memory, on both 32-bit and 64-bit architectures.
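As a rough illustration, here is a sketch that estimates how many elements fit before the backing array hits that ceiling (the 2GB figure is the documented object limit; the small array header overhead is ignored):

    using System;

    class CollectionLimitEstimate
    {
        static void Main()
        {
            const long maxObjectBytes = 2L * 1024 * 1024 * 1024; // 2GB CLR object limit

            // A List<T> stores its elements in a single backing array,
            // so that array itself must stay under the 2GB object limit.
            Console.WriteLine("Max int elements  (4 bytes): ~{0:N0}", maxObjectBytes / sizeof(int));
            Console.WriteLine("Max long elements (8 bytes): ~{0:N0}", maxObjectBytes / sizeof(long));
        }
    }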
To be able to benefit from the memory goodness of the 64-bit architecture, you have to compile your code targeting 64-bit. After that, naturally, your binary will run only on 64-bit, but it will benefit from the much larger amount of addressable memory available in RAM.
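Concretely (assuming the standard tooling; the Visual Studio equivalent is Properties > Build > Platform target):

    csc /platform:x64 Program.cs

or, in the .csproj file:

    <PropertyGroup>
      <PlatformTarget>x64</PlatformTarget>
    </PropertyGroup>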