I'm running a Java process with the -XX:+HeapDumpOnOutOfMemoryError
JVM flag and seeing the following output:
java.lang.OutOfMemoryError: Java heap space
Dumping heap to /local/disk2/heaps/heapdump.hprof ...
Dump file is incomplete: file size limit
Is there a way to get around this issue?
The -XX:+HeapDumpOnOutOfMemoryError
command-line option tells the HotSpot VM to generate a heap dump when an allocation from the Java heap or the permanent generation cannot be satisfied. The option adds no overhead during normal operation, so it is useful for production systems where an OutOfMemoryError can take a long time to surface.
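Because HeapDumpOnOutOfMemoryError (and HeapDumpPath) are "manageable" HotSpot flags, they can also be inspected and turned on in an already-running JVM, without a restart, via the HotSpot diagnostic MXBean. A minimal sketch (the `/tmp` dump path is just an illustration; pick a volume with enough free space):

```java
import java.lang.management.ManagementFactory;
import com.sun.management.HotSpotDiagnosticMXBean;

public class HeapDumpFlag {
    public static void main(String[] args) {
        // JDK-specific diagnostic MBean, available on HotSpot-based JDKs
        HotSpotDiagnosticMXBean diag =
                ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);

        // Manageable flag: can be flipped at runtime without restarting the JVM
        diag.setVMOption("HeapDumpOnOutOfMemoryError", "true");

        // Illustrative path; choose a directory with enough free space
        diag.setVMOption("HeapDumpPath", "/tmp");

        System.out.println(diag.getVMOption("HeapDumpOnOutOfMemoryError"));
    }
}
```

The same runtime toggle is available from the command line via `jinfo -flag +HeapDumpOnOutOfMemoryError <pid>`.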
To resolve the specific problem you are facing, you can apply one of the following corrective measures:
Measure 1: -XX:HeapDumpSegmentSize
The -XX:HeapDumpSegmentSize option specifies an appropriate segment size when generating a segmented HPROF heap dump.
Format
-XX:HeapDumpSegmentSize=size[k|K][m|M][g|G]
Example
java -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpSegmentSize=512M myApp
Default Value
1 GB
Measure 2: -XX:SegmentedHeapDumpThreshold
The -XX:SegmentedHeapDumpThreshold option generates a segmented heap dump (.hprof file, 1.0.2 format) when the heap usage is larger than the specified size.
The segmented HPROF dump format is required to correctly generate heap dumps containing more than 4 GB of data. If -XX:SegmentedHeapDumpThreshold is set to more than 4 GB, heap dumps may not be generated correctly.
Format
-XX:SegmentedHeapDumpThreshold=size
Example
java -XX:SegmentedHeapDumpThreshold=512M myApp
Default Value
2 GB
A simpler explanation for that exact failure message may be:
There was not enough free space in the dump directory.
Unless changed via the JVM option -XX:HeapDumpPath (see documentation), the dump directory is the process's current working directory. Since the message explicitly mentions a file size limit, it is also worth checking the process's maximum file size (ulimit -f on Unix-like systems).
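A quick sanity check along these lines: compare the free space on the dump volume against the maximum heap size, since a full heap dump can be roughly as large as the heap itself. A minimal sketch (the class name and the default of `java.io.tmpdir` are illustrative; pass your real dump directory as an argument):

```java
import java.io.File;

public class DumpDirCheck {
    public static void main(String[] args) {
        // Illustrative default; in practice, pass the -XX:HeapDumpPath directory
        File dumpDir = new File(args.length > 0
                ? args[0]
                : System.getProperty("java.io.tmpdir"));

        long freeBytes = dumpDir.getUsableSpace();        // free space on that volume
        long maxHeap = Runtime.getRuntime().maxMemory();  // rough upper bound on dump size

        System.out.printf("free=%d bytes, maxHeap=%d bytes, enoughRoom=%b%n",
                freeBytes, maxHeap, freeBytes > maxHeap);
    }
}
```

If `enoughRoom` is false, point -XX:HeapDumpPath at a larger volume before the next OutOfMemoryError strikes.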