 

How to investigate excessive java garbage collection

Tags:

java

tomcat

I have a Tomcat instance which is exhibiting the following behaviour:

  • Accept a single http incoming request.
  • Issue one request to a backend server and get back about 400kb of XML.
  • Pass through this XML and transform it into about 400kb of JSON.
  • Return the JSON response.

The problem is that, in the course of handling this 400kb request, my webapp generates about 100mb of garbage, which fills up the Eden space and triggers a young generation collection.

I have tried to use the built-in Java hprof functionality to do allocation-site profiling, but Tomcat didn't seem to start up properly with it in place. It's possible that I was just a bit impatient, as I imagine memory allocation profiling has a high overhead and therefore Tomcat startup might take a long time.
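For reference, this is roughly how the hprof allocation-site agent is usually enabled for Tomcat (a hedged sketch: the `CATALINA_OPTS` approach and the option values shown are assumptions about a typical setup, per the JDK 5/6 hprof agent syntax):

```shell
# Record allocation sites; startup is slow because every allocation is
# sampled and a stack trace is captured. A smaller depth reduces overhead.
export CATALINA_OPTS="-agentlib:hprof=heap=sites,depth=6"
./catalina.sh start
# On JVM shutdown (or Ctrl-\ / kill -QUIT), allocation sites are written
# to java.hprof.txt in the working directory.
```

With allocation profiling enabled, a startup that normally takes seconds can take minutes, so a hung-looking Tomcat may simply still be starting.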

What are the best tools to use to do java memory profiling of very young objects/garbage? I can't use heap dumps because the objects I'm interested in are garbage.

asked Jul 03 '10 by mchr


2 Answers

As to the actual problem: XML parsing can be very memory-hogging when using a DOM-based parser, since the whole document tree is held in memory at once. Consider using a SAX-based or binary XML-based parser (VTD-XML is a Java API based on the latter).
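To illustrate the difference, here is a minimal SAX sketch (class and element names are made up for illustration): the parser pushes events to a handler, so only the current event is in memory rather than a full tree.

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class SaxCount {
    // Counts elements by streaming through the document; no DOM is built,
    // so memory use stays flat regardless of document size.
    static int countElements(String xml) throws Exception {
        final int[] count = {0};
        SAXParserFactory.newInstance().newSAXParser().parse(
            new ByteArrayInputStream(xml.getBytes("UTF-8")),
            new DefaultHandler() {
                @Override
                public void startElement(String uri, String local,
                                         String qName, Attributes attrs) {
                    count[0]++;
                }
            });
        return count[0];
    }

    public static void main(String[] args) throws Exception {
        System.out.println(countElements("<root><a/><b>text</b></root>")); // prints 3
    }
}
```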

Actually, if the XML-to-JSON mapping is purely 1:1, then you can also consider just reading the XML and writing the JSON on the fly, element by element, keeping only a small stack of open elements.
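A streaming XML-to-JSON pass could look roughly like this StAX sketch. It assumes a flat `<items><item>…</item></items>` structure purely for illustration; a real 1:1 mapping would handle nesting with a small stack as described above.

```java
import java.io.StringReader;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class StreamXmlToJson {
    // Pulls events one at a time and appends JSON as it goes; neither the
    // XML tree nor an intermediate object model is ever fully in memory.
    static String itemsToJsonArray(String xml) throws Exception {
        XMLStreamReader r = XMLInputFactory.newInstance()
                .createXMLStreamReader(new StringReader(xml));
        StringBuilder json = new StringBuilder("[");
        boolean first = true;
        while (r.hasNext()) {
            if (r.next() == XMLStreamConstants.START_ELEMENT
                    && "item".equals(r.getLocalName())) {
                if (!first) json.append(",");
                json.append("\"").append(r.getElementText()).append("\"");
                first = false;
            }
        }
        r.close();
        return json.append("]").toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(
            itemsToJsonArray("<items><item>a</item><item>b</item></items>"));
    }
}
```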


Back to the question: I suggest using VisualVM for this. You can find here a blog article on how to get it to work with Tomcat.

answered Nov 14 '22 by BalusC


You can use the profiler in jvisualvm, which ships with the JDK, to do memory profiling.

Also have a look at javax.xml.transform.Templates to cache the compiled XSLT transformer between requests.

http://java.sun.com/j2se/1.5.0/docs/api/javax/xml/transform/Templates.html
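A minimal sketch of that caching pattern (the identity stylesheet below is just for the demo; a real app would load its own XSLT once at startup): the expensive stylesheet compilation happens once into a thread-safe `Templates`, and each request only creates a cheap `Transformer` from it.

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Templates;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class CachedXslt {
    // Identity stylesheet, used here only as a placeholder.
    private static final String XSLT =
        "<xsl:stylesheet version='1.0' "
      + "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
      + "<xsl:template match='@*|node()'><xsl:copy>"
      + "<xsl:apply-templates select='@*|node()'/></xsl:copy></xsl:template>"
      + "</xsl:stylesheet>";

    // Compiled once; Templates is thread-safe and reusable across requests.
    private static final Templates TEMPLATES = compile();

    private static Templates compile() {
        try {
            return TransformerFactory.newInstance()
                    .newTemplates(new StreamSource(new StringReader(XSLT)));
        } catch (Exception e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    static String transform(String xml) throws Exception {
        StringWriter out = new StringWriter();
        // newTransformer() per call is cheap; compilation already happened.
        TEMPLATES.newTransformer().transform(
                new StreamSource(new StringReader(xml)),
                new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(transform("<root><a>1</a></root>"));
    }
}
```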

answered Nov 14 '22 by Thorbjørn Ravn Andersen