Why do I get OutOfMemory when 20% of the heap is still free?

I've set the max heap to 8 GB. When my program starts using about 6.4 GB (as reported in VisualVM), the garbage collector starts taking up most of the CPU and the program crashes with OutOfMemory when making a ~100 MB allocation. I am using Oracle Java 1.7.0_21 on Windows.

My question is whether there are GC options that would help with this. I'm not passing anything except -Xmx8g.

My guess is the heap is getting fragmented, but shouldn't the GC compact it?
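
For concreteness, here is a simplified, hypothetical sketch of the allocation pattern (the class name and array sizes are made up; the real program is more involved). It retains a mix of ~8 MB and ~100 MB arrays and prints heap usage until an allocation fails:

    import java.util.ArrayList;
    import java.util.List;

    public class HeapFillDemo {
        public static void main(String[] args) {
            // Run with -Xmx8g. Retains a mix of medium and large double arrays
            // and prints heap usage before each allocation.
            List<double[]> retained = new ArrayList<double[]>();
            Runtime rt = Runtime.getRuntime();
            try {
                while (true) {
                    long usedMb = (rt.totalMemory() - rt.freeMemory()) >> 20;
                    System.out.println("used ~" + usedMb + " MB");
                    // ~100 MB every 12th allocation, ~8 MB otherwise
                    int len = (retained.size() % 12 == 0) ? 13_000_000 : 1_000_000;
                    retained.add(new double[len]);
                }
            } catch (OutOfMemoryError e) {
                long headroomMb = (rt.maxMemory() - (rt.totalMemory() - rt.freeMemory())) >> 20;
                System.out.println("OutOfMemoryError with ~" + headroomMb + " MB of max heap unused");
            }
        }
    }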

asked Jun 11 '13 by Aleksandr Dubinsky

People also ask

How can we avoid OutOfMemoryError in Java?

One common variant of this error occurs when the permanent generation of the heap fills up. To fix that OutOfMemoryError, increase the size of the permanent generation with the JVM option -XX:MaxPermSize (for example, -XX:MaxPermSize=256m).

Which of these could potentially throw an OutOfMemory error?

The java.lang.OutOfMemoryError. Usually, this error is thrown when there is insufficient space to allocate an object in the Java heap: the garbage collector cannot make space available to accommodate a new object, and the heap cannot be expanded further.

How do you solve heap memory problems?

There are several ways to eliminate a heap memory issue: Increase the maximum amount of heap available to the VM using the -Xmx VM argument. Use partitioning to distribute the data over additional machines. Overflow or expire the region data to reduce the heap memory footprint of the regions.

What is heap exhaustion?

The Java™ heap becomes exhausted when garbage collection cannot free enough space to satisfy a new object allocation. Garbage collection can free only objects that are no longer referenced by other objects or by the thread stacks (see Memory management for more details).


1 Answer

Collecting bits and pieces of information (which is surprisingly difficult, since the official documentation is quite bad), I've determined...

There are generally two reasons this may happen, both related to fragmentation of free space (i.e., the free space exists in pieces too small for a large object to be allocated). First, the garbage collector might not do compaction, which is to say it does not defragment the memory; even a collector that does compact may not do it perfectly. Second, the garbage collector typically splits the memory area into regions that it reserves for different kinds of objects, and it may not think to take free memory from the region that has it and give it to the region that needs it.

The CMS garbage collector does not do compaction, while the others (Serial, Parallel, ParallelOld, and G1) do. The default collector in Java 8 is ParallelOld.

All garbage collectors split memory into regions, and, AFAIK, all of them are too lazy to try very hard to prevent an OOM error. The command line option -XX:+PrintGCDetails is very helpful for some of the collectors in showing the sizes of the regions and how much free space they have.
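
If you want to watch the same numbers from inside the program, the standard java.lang.management API exposes per-pool usage. Here is a minimal sketch (the class name is arbitrary, and the pool names it prints depend on which collector is in use):

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryPoolMXBean;
    import java.lang.management.MemoryUsage;

    public class PoolStats {
        public static void main(String[] args) {
            // Prints each heap/non-heap pool (eden, survivor, old gen, ...)
            // with its current usage; getMax() returns -1 if the pool has no limit.
            for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
                MemoryUsage u = pool.getUsage();
                System.out.printf("%-30s used=%d MB committed=%d MB max=%d MB%n",
                        pool.getName(),
                        u.getUsed() >> 20,
                        u.getCommitted() >> 20,
                        u.getMax() < 0 ? -1 : u.getMax() >> 20);
            }
        }
    }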

It is possible to experiment with different garbage collectors and tuning options. Regarding my question, the G1 collector (enabled with the JVM flag -XX:+UseG1GC) solved the issue I was having. However, this was basically down to chance (in other situations, it OOMs more quickly). Some of the collectors (Serial, CMS, and G1) have extensive tuning options for selecting the sizes of the various regions, which let you waste plenty of time futilely trying to solve the problem.
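
When experimenting with flags it is easy to lose track of which collector actually ended up running, so a quick sanity check can help. A small sketch using the standard GarbageCollectorMXBean API (the class name is arbitrary; the exact collector names are HotSpot-specific):

    import java.lang.management.GarbageCollectorMXBean;
    import java.lang.management.ManagementFactory;

    public class WhichGc {
        public static void main(String[] args) {
            // With -XX:+UseG1GC, HotSpot typically reports "G1 Young Generation"
            // and "G1 Old Generation"; the default parallel collector shows up
            // as "PS Scavenge" and "PS MarkSweep".
            for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
                System.out.println(gc.getName() + " (collections so far: " + gc.getCollectionCount() + ")");
            }
        }
    }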

Ultimately, the real solutions are rather unpleasant. The first is to install more RAM. The second is to use smaller arrays. The third is to use ByteBuffer.allocateDirect. Direct byte buffers (and their int/float/double views) are array-like objects with array-like performance that are allocated on the OS's native heap. Because the native heap is backed by the CPU's virtual-memory hardware, it is free from fragmentation issues and can even make effective use of the disk's swap space (allowing you to allocate more memory than there is physical RAM). A big drawback, however, is that the JVM doesn't really know when direct buffers should be deallocated, which makes this option better suited to long-lived objects. The final, possibly best, and certainly most unpleasant option is to allocate and deallocate memory natively using JNI calls, and use it in Java by wrapping it in a ByteBuffer.
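
For illustration, a minimal sketch of the allocateDirect approach (the buffer size and class name are arbitrary):

    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;
    import java.nio.DoubleBuffer;

    public class DirectBufferDemo {
        public static void main(String[] args) {
            // 100 MB allocated on the native heap, outside the Java heap, so it
            // is not subject to Java heap fragmentation. The total amount of
            // direct memory can be capped with -XX:MaxDirectMemorySize.
            ByteBuffer bytes = ByteBuffer.allocateDirect(100 * 1024 * 1024)
                                         .order(ByteOrder.nativeOrder());

            // View the same memory as an array-like buffer of doubles.
            DoubleBuffer doubles = bytes.asDoubleBuffer();
            doubles.put(0, 3.14);       // "array" write
            double x = doubles.get(0);  // "array" read
            System.out.println("doubles[0] = " + x + ", capacity = " + doubles.capacity());

            // There is no explicit free: the native memory is released only
            // after the buffer object itself becomes garbage and is collected.
        }
    }

Using the platform's native byte order generally helps the primitive views achieve the array-like performance mentioned above.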

answered Oct 05 '22 by Aleksandr Dubinsky