I have a HotSpot JVM heap dump that I would like to analyze. The VM ran with -Xmx31g, and the heap dump file is 48 GB in size.
I won't even try jhat, as it requires about five times the heap memory (that would be 240 GB in my case) and is awfully slow. Eclipse MAT crashes with an ArrayIndexOutOfBoundsException after analyzing the heap dump for several hours.
What other tools are available for that task? A suite of command line tools would be best, consisting of one program that transforms the heap dump into efficient data structures for analysis, combined with several other tools that work on the pre-structured data.
Eclipse Memory Analyzer Tool (MAT) is used for analyzing heap dump files (see Capturing heap dumps before FullGCs to troubleshoot memory problems), which contain the objects that were in memory. Each heap dump file can be thought of as a snapshot in time, detailing the objects that occupied the JVM heap at that moment.
The recommended tool is IBM Monitoring and Diagnostic Tools for Java - Memory Analyzer. Note that heap dumps are platform agnostic and can be analyzed on any platform, regardless of where they were created.
A dump can be taken on demand (using the jmap JDK utility) or when an app fails with an OutOfMemoryError (if the JVM was started with the -XX:+HeapDumpOnOutOfMemoryError command line option). A heap dump is a binary file roughly the size of your JVM's heap, so it can only be read and analyzed with special tools.
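For example, both capture paths look roughly like this (a sketch: the PID 12345, the file paths, and app.jar are placeholders):
# On-demand dump of a running JVM; "live" limits the dump to reachable objects
jmap -dump:live,format=b,file=/tmp/jvm.hprof 12345
# Automatic dump if the app dies with an OutOfMemoryError
java -Xmx31g -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp -jar app.jar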
Normally, what I use is ParseHeapDump.sh, included within Eclipse Memory Analyzer and described here, and I do that on one of our more beefed-up servers (download and copy over the Linux .zip distro, unzip it there). The shell script needs fewer resources than parsing the heap from the GUI, plus you can run it on your beefy server with more resources (you can allocate more by adding something like -vmargs -Xmx40g -XX:-UseGCOverheadLimit to the end of the last line of the script). For instance, the last line of that file might look like this after modification:
./MemoryAnalyzer -consolelog -application org.eclipse.mat.api.parse "$@" -vmargs -Xmx40g -XX:-UseGCOverheadLimit
Run it like:
./path/to/ParseHeapDump.sh ../today_heap_dump/jvm.hprof
After that succeeds, it creates a number of "index" files next to the .hprof file.
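The directory then looks something like this (the exact set of index file names varies by MAT version, so treat this listing as illustrative):
ls ../today_heap_dump/
jvm.hprof         jvm.idx.index     jvm.inbound.index   jvm.outbound.index
jvm.domIn.index   jvm.domOut.index  jvm.a2s.index       jvm.threads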
After creating the indices, I generate reports from them, scp those reports to my local machine, and try to see if I can find the culprit just by that (just the reports, not the indices). Here's a tutorial on creating the reports.
Example report:
./ParseHeapDump.sh ../today_heap_dump/jvm.hprof org.eclipse.mat.api:suspects
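The generated report lands next to the dump as a small zip of HTML pages, so it is cheap to copy to your workstation (the file name below is illustrative; MAT derives it from the dump name and the report id):
scp server:/path/to/today_heap_dump/jvm_Leak_Suspects.zip .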
Other report options: org.eclipse.mat.api:overview and org.eclipse.mat.api:top_components
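If you want all three reports in one pass, a small shell loop over the report ids works (a sketch, reusing the paths from above; reruns are cheap because the indices already exist):
for report in suspects overview top_components; do
    ./ParseHeapDump.sh ../today_heap_dump/jvm.hprof "org.eclipse.mat.api:$report"
done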
If those reports are not enough and I need some more digging (say, via OQL), I scp the indices as well as the hprof file to my local machine, and then open the heap dump (with the indices in the same directory as the heap dump) in my Eclipse MAT GUI. From there, it does not need too much memory to run.
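To give a flavour of that deeper digging, here are two illustrative OQL queries for MAT's OQL pane (com.example.CacheEntry is a hypothetical class name; the String.value field depends on your JDK version; @length and @retainedHeapSize are MAT's built-in attributes):
SELECT s, s.value.@length FROM java.lang.String s WHERE s.value.@length > 10000
SELECT x, x.@retainedHeapSize FROM com.example.CacheEntry x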