Tomcat memory consumption is more than heap + permgen space

I am observing a mismatch in Tomcat RAM consumption between what the OS says and what jVisualVM says.

From htop, the Tomcat JVM has 993 MB of resident memory.

From jVisualVM, the Tomcat JVM is using

  • Heap Max: 1,070,399,488 B
  • Heap Size: 298,438,656 B
  • Heap Used: variable, between 170 MB and 270 MB
  • PermGen Max: 268,435,456 B
  • PermGen Size: 248,872,960 B
  • PermGen Used: slightly variable, around 150MB

From my understanding the OS memory consumption should be Heap Size + PermGen Size ~= 522 MB. But that's 471 MB less than what I'm observing.
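
For reference, the same figures can be read from inside the JVM through the standard java.lang.management API. This is only a minimal sketch (the class name is made up for illustration), but the committed sizes it prints should correspond to the Heap Size / PermGen Size figures above:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class HeapVsRss {
    public static void main(String[] args) {
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = mem.getHeapMemoryUsage();       // Heap Size/Used/Max as listed above
        MemoryUsage nonHeap = mem.getNonHeapMemoryUsage(); // PermGen plus code cache on this JVM

        System.out.printf("heap committed     : %,d B%n", heap.getCommitted());
        System.out.printf("non-heap committed : %,d B%n", nonHeap.getCommitted());
        System.out.printf("sum                : %,d B  (compare with the RSS htop reports)%n",
                heap.getCommitted() + nonHeap.getCommitted());
    }
}
```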

Anyone got an idea what I am missing here?

PS: I know that my max heap is much higher than what is used, but I'm assuming that should have no effect if the JVM does not use it (i.e. Heap Size is lower).

Thanks! Marc

2 Answers

From my understanding the OS memory consumption should be Heap Size + PermGen Size ~= 522 MB. But that's 471 MB less than what I'm observing. Anyone got an idea what I am missing here?

If I understand the question, what you are seeing is a combination of memory fragmentation and JVM memory overhead in other areas. We often see our production programs use about twice the memory we would expect from their memory settings.

Memory fragmentation can mean that although the JVM thinks the OS has given it some number of bytes, a certain additional number of bytes had to be handed over because of memory subsystem optimizations.

In terms of JVM overhead, there are a number of other storage areas that are not included in the standard memory configs; a short sketch after the quoted list below shows how to see some of these regions from a running JVM. Here's a good discussion about this. To quote:

The following are examples of things that are not part of the garbage collected heap and yet are part of the memory required by the process:

  • Code to implement the JVM
  • The C manual heap for data structures implementing the JVM
  • Stacks for all of the threads in the system (app + JVM)
  • Cached Java bytecode (for libraries and the application)
  • JITed machine code (for libraries and the application)
  • Static variables of all loaded classes
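
To make the quoted list concrete, here is a small sketch that enumerates every memory pool a running JVM exposes, heap and non-heap alike, using only the standard java.lang.management API (the exact pool names, e.g. "Perm Gen" or "Code Cache", depend on the JVM and garbage collector in use):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;
import java.lang.management.MemoryUsage;

public class MemoryPools {
    public static void main(String[] args) {
        // Walk every pool the JVM reports; the non-heap ones (e.g. PermGen,
        // code cache) are exactly the regions that do not show up in -Xmx.
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            MemoryUsage u = pool.getUsage();
            System.out.printf("%-20s [%s]  used=%,d B  committed=%,d B%n",
                    pool.getName(), pool.getType(), u.getUsed(), u.getCommitted());
        }
    }
}
```
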
Gray


The first thing we have to bear in mind is that

JVM process heap (OS process) = Java object heap + [Permanent space + Code generation + Socket buffers + Thread stacks + Direct memory space + JNI code + JNI-allocated memory + Garbage collection]

where in this "collection" the permanent space is usually the biggest chunk.
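
Two of those contributors can at least be estimated from inside the process. A rough sketch (the 512 KB per-thread stack size is only an assumption standing in for whatever -Xss gives you, and the BufferPoolMXBean part needs Java 7 or newer):

```java
import java.lang.management.BufferPoolMXBean;
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class NativeSideEstimate {
    public static void main(String[] args) {
        // Every live thread owns a stack outside the Java object heap.
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        long assumedStackKb = 512; // assumption: replace with your actual -Xss value
        System.out.printf("threads=%d  -> roughly %,d KB of stacks (assuming %d KB each)%n",
                threads.getThreadCount(),
                threads.getThreadCount() * assumedStackKb,
                assumedStackKb);

        // Direct (off-heap) NIO buffers, e.g. NIO socket buffers (Java 7+ API).
        for (BufferPoolMXBean pool : ManagementFactory.getPlatformMXBeans(BufferPoolMXBean.class)) {
            System.out.printf("buffer pool %-7s memoryUsed=%,d B%n",
                    pool.getName(), pool.getMemoryUsed());
        }
    }
}
```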

Given that, I guess the key here is the JVM option -XX:MinHeapFreeRatio=n, where n is a value from 0 to 100 specifying that the heap should be expanded if less than n% of it is free. It usually defaults to 40 (Sun), so when the JVM allocates memory it gets enough to keep 40% free (this is not applicable if you have -Xms == -Xmx). Its "twin option", -XX:MaxHeapFreeRatio, usually defaults to 70 (Sun).

Therefore, in a Sun JVM the proportion of free heap after each garbage collection is kept between 40% and 70%: if less than 40% of the heap is free after a GC, the heap is expanded. So, assuming you are running a Sun JVM, I would guess that the size of the "java object heap" has reached a peak of about 445 MB, thus producing an expanded "object heap" of about 740 MB (to guarantee 40% free). Then, (object heap) + (perm space) = 740 + 250 = 990 MB.
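
The arithmetic behind that guess, written out as a tiny sketch (the 445 MB live-set figure is the assumption made above, not something measured):

```java
public class HeapExpansionGuess {
    public static void main(String[] args) {
        double minFreeRatio = 0.40; // Sun default for -XX:MinHeapFreeRatio
        double liveMb = 445;        // assumed peak of live objects after a GC
        double permGenMb = 250;     // roughly the PermGen Size reported in the question

        // The heap is grown until at least 40% of it is free again.
        double expandedHeapMb = liveMb / (1 - minFreeRatio);
        System.out.printf("expanded object heap ~ %.0f MB%n", expandedHeapMb);             // ~742 MB
        System.out.printf("object heap + perm   ~ %.0f MB%n", expandedHeapMb + permGenMb); // ~992 MB
    }
}
```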

Maybe you can try to output GC details or use jconsole to verify the evolution of the heap size.

P.S.: when dealing with issues like this, it is good to post OS and JVM details.

jalopaba