I've got this webapp that needs some memory tuning. While I'm already profiling the application itself and trimming things down, the JVM itself seems overly bloated to me on our busiest instance. (The lower volume instances do not have this problem.) The details:
Linux 2.6.9-78.0.5.ELsmp #1 SMP x86_64
Java HotSpot(TM) 64-Bit Server VM (build 10.0-b23, mixed mode)
-d64 in startup.sh
If I could refactor out the need for a 64-bit JVM and drop the -d64 switch, would that make the JVM's resident memory footprint smaller? In other words: what impact, if any, does the -d64 switch have on the Sun JVM's resident memory usage?
Using the -d64 switch puts the JVM into 64-bit mode. Technically, on Solaris/Linux and most Unixes, the JVM process will execute in the LP64 model.
The LP64 model differs from the 32-bit model (ILP32) in that pointers are 64 bits wide instead of 32. For the JVM this allows greater memory addressability, but it also means that the space occupied by object references alone has doubled. So for the same number of live objects at a given time, a 64-bit JVM carries noticeably more bloat than a 32-bit one.
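As a quick sanity check before refactoring anything, a small sketch like the one below (my own illustration, not part of the original question or answer) prints the properties that tell you which data model the running JVM is using; sun.arch.data.model is specific to Sun/Oracle HotSpot and may be absent on other vendors' JVMs.

    public class JvmDataModel {
        public static void main(String[] args) {
            // "64" when started with -d64 (LP64), "32" on a 32-bit HotSpot VM.
            System.out.println("sun.arch.data.model = " + System.getProperty("sun.arch.data.model"));
            // CPU architecture as seen by the JVM, e.g. "amd64" or "i386".
            System.out.println("os.arch             = " + System.getProperty("os.arch"));
            // e.g. "Java HotSpot(TM) 64-Bit Server VM".
            System.out.println("java.vm.name        = " + System.getProperty("java.vm.name"));
        }
    }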
Another thing that is often forgotten is the size of the instructions themselves: on a 64-bit JVM, the generated code operates on the native 64-bit machine word size, which adds further to the footprint.
If, however, you use compressed object pointers in a 64-bit environment, the JVM stores object references as 32-bit values and encodes and decodes them whenever necessary (encoding kicks in for heap sizes above 4 GB, and compressed pointers work for heaps up to roughly 32 GB). Briefly stated, when you use compressed pointers, the JVM uses 32-bit-wide values wherever possible.
Hint: switch on the UseCompressedOops flag with -XX:+UseCompressedOops to get rid of some of that bloat. YMMV, but people have reported up to a 50% drop in memory footprint from compressed oops.
EDIT
The UseCompressedOops flag is supported in version 14.0 of the Java HotSpot VM, available from Java 6 Update 14 onwards.
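If you want to verify at runtime that the flag is recognized and actually enabled on your build, a sketch along the following lines would do it. This is my own illustration rather than part of the original answer; it relies on the HotSpot-specific com.sun.management.HotSpotDiagnosticMXBean, so it only works on a Sun/Oracle JDK.

    import java.lang.management.ManagementFactory;
    import com.sun.management.HotSpotDiagnosticMXBean;

    public class CheckCompressedOops {
        public static void main(String[] args) throws Exception {
            // Proxy to the HotSpot diagnostic MBean exposed by the platform MBean server.
            HotSpotDiagnosticMXBean hotspot = ManagementFactory.newPlatformMXBeanProxy(
                    ManagementFactory.getPlatformMBeanServer(),
                    "com.sun.management:type=HotSpotDiagnostic",
                    HotSpotDiagnosticMXBean.class);
            try {
                // getVMOption throws IllegalArgumentException on VMs that do not
                // know the flag (HotSpot builds older than 14.0 / Java 6u14).
                System.out.println("UseCompressedOops = "
                        + hotspot.getVMOption("UseCompressedOops").getValue());
            } catch (IllegalArgumentException e) {
                System.out.println("This VM does not support UseCompressedOops");
            }
        }
    }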