I'm running the following method on my development IIS server (from VS2010 IDE) on a 64-bit Windows 7 machine with 16GB of installed RAM:
public static MemoryStream copyStreamIntoMemoryStream(Stream stream)
{
    long uiLen = stream.Length;
    byte[] buff = new byte[0x8000];
    int nSz;
    MemoryStream ms = new MemoryStream();
    try
    {
        while ((nSz = stream.Read(buff, 0, buff.Length)) != 0)
        {
            ms.Write(buff, 0, nSz);
        }
    }
    finally
    {
        Debug.WriteLine("Alloc size=" + ms.Length);
    }
    return ms;
}
and I get a System.OutOfMemoryException
on this line:
ms.Write(buff, 0, nSz);
It is thrown when 268435456 bytes have been allocated:
Alloc size=268435456
which is 0x10000000 or 256 MB. So I'm wondering if there's some global setting that I need to set to make it work?
Here's a screenshot of the configuration setting for the project:
When data structures or data sets that reside in memory become so large that the common language runtime is unable to allocate enough contiguous memory for them, an OutOfMemoryException exception results.
OutOfMemoryException is a runtime exception that tells the programmer there is not enough memory, or not enough contiguous memory, for an allocation the program requested. To avoid it, take the necessary precautions and handle the exception where it makes sense.
Short answer - the dev server is a 32-bit process.
Long answer to "why only 256 MB?":
First of all, let's understand how it works.
MemoryStream keeps all of its data in an internal byte[] buffer. It cannot predict the exact size this buffer will need to be, so it starts with some initial capacity.
The Position and Length properties don't reflect the actual buffer size - they are logical values that tell you how many bytes have been written, and they can easily be smaller than the size of the physical buffer.
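A minimal sketch of that difference (the exact Capacity values depend on the runtime's growth policy, but doubling is the typical behavior):

var ms = new MemoryStream();       // starts with an empty internal buffer
ms.Write(new byte[100], 0, 100);   // buffer grows to the 256-byte minimum
ms.Write(new byte[200], 0, 200);   // 300 bytes no longer fit, so the buffer doubles to 512
Console.WriteLine(ms.Length);      // 300 - logical number of bytes written
Console.WriteLine(ms.Capacity);    // 512 - physical buffer size (typical value)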
When this internal buffer can no longer fit the data, it has to be "resized", which in practice means allocating a new buffer twice the size of the previous one and then copying the data from the old buffer into the new one.
So, if your buffer has already grown to 256 MB and more data needs to be written, .NET has to find another contiguous 512 MB block while the old 256 MB buffer is still in place, which means the heap needs at least 768 MB available at the moment of the allocation that throws OutOfMemory.
Also note that by default no single object in .NET, including arrays, can be larger than 2 GB.
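As a side note, since the original method already reads stream.Length, one way to reduce all this doubling and copying is to give the MemoryStream its final capacity up front. This is only a sketch of that idea (Stream.CopyTo is available from .NET 4.0); it cuts down the reallocations but does not change the 32-bit address-space limit that is the real problem here:

public static MemoryStream copyStreamIntoMemoryStream(Stream stream)
{
    // Pre-size the internal buffer so it never has to double and copy.
    // checked() throws OverflowException instead of silently truncating if Length > int.MaxValue.
    var ms = new MemoryStream(checked((int)stream.Length));
    stream.CopyTo(ms);
    ms.Position = 0;
    return ms;
}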
OK, so here is a sample piece that simulates what's happening:
byte[] buf = new byte[32768 - 10];
for (; ; )
{
    long newSize = (long)buf.Length * 2;
    Console.WriteLine(newSize);
    if (newSize > int.MaxValue)
    {
        Console.WriteLine("Now we reach the max 2Gb per single object, stopping");
        break;
    }
    var newbuf = new byte[newSize];
    Array.Copy(buf, newbuf, buf.Length);
    buf = newbuf;
}
If it is built as x64/AnyCPU and run from a console, everything is fine.
If it is built as x86, it fails even in the console.
If you put it into, say, Page_Load, build it as x64, and open the page through the VS.NET web server, it fails, because that server runs as a 32-bit process (you can confirm the bitness with the check below).
If you do the same under full IIS, everything is fine.
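A quick way to verify which case you are in is to log the bitness of the hosting process (a minimal sketch; Environment.Is64BitProcess is available from .NET 4.0 onward):

// Drop this into Page_Load (or a console app) to see whether the host is 32-bit or 64-bit.
Debug.WriteLine("64-bit process: " + Environment.Is64BitProcess);
Debug.WriteLine("Pointer size:   " + IntPtr.Size + " bytes");   // 4 in a 32-bit process, 8 in a 64-bit one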
Hope this helps.