
Memory usage between 32-bit pool and 64-bit pool


We have an ASP.NET application, built around MonoRail and NHibernate, and I have noticed strange behavior depending on whether it runs in 64-bit or 32-bit mode. Everything is compiled as AnyCPU and runs fine in both modes, but the memory usage differs dramatically.

Look at the following snapshots from ANTS:

32-bit snapshot: [ANTS memory profiler screenshot]

vs

64-bit snapshot: [ANTS memory profiler screenshot]

The usage scenario for both snapshots is pretty much equivalent (I hit the same pages on both runs).

Firstly, why is the Unused memory so high in 64-bit mode? And why would unmanaged memory be four times the size in 64-bit mode?

Any insight on this would be really helpful.

asked Nov 02 '12 by jishi



2 Answers

The standard answer to memory growth on 64-bit systems is that most memory operations are aligned to 16 bytes by default. Reads to and from the 128-bit XMM registers are expected to hit 16-byte boundaries, two parameters on the stack take the same amount of memory as three (the return address fills the missing 8 bytes), and GNU malloc aligns allocated blocks to 16-byte boundaries.

If the allocated units are small, the overhead is huge: first the overhead of aligning the data itself, and then the overhead of aligning the bookkeeping associated with that data.

I'd also predict that on 64-bit systems the data structures have evolved: instead of binary, 2-3-4, balanced, or splay trees, it may make sense to use radix-16 trees, which can have a lot of slack but can be processed quickly with the SSE extensions that are guaranteed to be present.

answered Sep 30 '22 by Aki Suihkonen


The initial memory allocation for a 64-bit process is much higher than it would be for an equivalent 32-bit process.

Theoretically this allows garbage collection to run much less often, which should improve performance. It also helps with fragmentation, since larger memory blocks are allocated at a time.

This article gives a more detailed explanation: http://blogs.msdn.com/b/maoni/archive/2007/05/15/64-bit-vs-32-bit.aspx

The higher unmanaged memory usage you are seeing is probably due to the fact that a .NET object in 32-bit mode uses a minimum of 12 bytes (8 bytes of header + a 4-byte reference), while the same object in 64-bit mode takes 24 bytes (16 bytes of header + an 8-byte reference).

Another article that explains this more completely: http://www.simple-talk.com/dotnet/.net-framework/object-overhead-the-hidden-.net-memory--allocation-cost/

answered Sep 30 '22 by CIGuy