I am seeing long GC cycles.
From the checks I ran, I saw there are too many objects in the tenured (old) area of the heap.
Is there any utility to find out which objects are in which area of the heap, or to get statistics about those objects?
I am using Sun/Oracle HotSpot JVM (Java 6).
EDIT: little bit more details about my problem:
I have a big heap (32 GB), and it looks like even when the old area is only 30% full, running GC manually causes pauses of 15 seconds. I want to know which objects are the "survivors" that remain in the old area, in order to know which object creation to optimize.
I'm not aware of any tool / utility that works with current generation JVMs.
But the flip-side is that I don't see how such a utility would be helpful.
Long GC times typically occur because your heap is too full. As the heap approaches 100% full, the amount of time spent in the GC tends to grow exponentially. In the worst case, the heap fills completely and your application gets an OutOfMemoryError. There are two possible solutions:
If the root cause is that the heap is too small (for the size of problem that your application is trying to solve) then either increase the heap size, or look for a way to reduce the application's working set; i.e. the number / size of objects that it needs to have "live" during the computation.
If the root cause is a memory leak, then find and fix it.
In both cases, using a memory profiler will help you analyse the problem. But you don't need to know which objects are in the old generation. It is not relevant to either the root cause of the problem or the solution to the problem.
I want to know which objects are the "survivors" that remain in the old area, in order to know which object creation to optimize.
This is starting to make a bit more sense. It sounds like you need to find out which objects are long-lived, rather than specifically which space they live in. You could possibly do that by using jhat to compare a sequence of heap snapshots. (There may be a better way ...)
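As a concrete starting point, a class histogram from jmap will at least show which classes dominate the live set, and jhat can compare a later dump against a baseline to flag new objects. A hedged sketch of the workflow (the PID and file names are placeholders):

```shell
# Class histogram of live objects only (note: :live forces a full GC).
jmap -histo:live <pid> > histo-1.txt

# Binary heap dump for deeper analysis.
jmap -dump:live,format=b,file=heap-1.hprof <pid>

# Browse a dump at http://localhost:7000, or mark objects that are
# new relative to an earlier baseline dump.
jhat heap-1.hprof
jhat -baseline heap-1.hprof heap-2.hprof
```

Objects present in both dumps are your long-lived candidates; objects only in the later dump are recent allocations.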
However, I still don't think this approach will help. The problem is that a full GC needs to traverse all reachable (hard, soft, weak, phantom) objects. And if you've got a 32 GB heap that is 30% full, you've still got a lot of objects to mark/sweep/relocate. I think the solution is likely to be to use a concurrent collector and tune it so that it can keep up with your application's object allocation rate.
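On a Java 6 HotSpot JVM the concurrent collector is CMS. A hedged sketch of the relevant flags; the occupancy threshold of 70 is an assumption you would need to tune for your own allocation rate:

```shell
java -Xms32g -Xmx32g \
     -XX:+UseConcMarkSweepGC \
     -XX:+UseParNewGC \
     -XX:CMSInitiatingOccupancyFraction=70 \
     -XX:+UseCMSInitiatingOccupancyOnly \
     -XX:+PrintGCDetails -XX:+PrintGCTimeStamps \
     -jar yourapp.jar
```

The occupancy flags make CMS start a concurrent cycle early and deterministically, so the old generation is collected in the background rather than in a stop-the-world full GC.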
It also sounds like you might be calling System.gc() directly from your code. Don't do that! Calling System.gc() will (typically) cause the JVM to do a full garbage collection, which is pretty much guaranteed to give you a pause. It is much better to let the JVM decide when to run the collector.
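If you cannot remove the System.gc() calls yourself (for example, a library makes them), HotSpot can be told to ignore them, or, with CMS, to turn them into concurrent cycles instead of stop-the-world collections:

```shell
# Ignore System.gc() calls entirely.
java -XX:+DisableExplicitGC -jar yourapp.jar

# Or, with CMS, run the explicit GC concurrently instead.
java -XX:+UseConcMarkSweepGC -XX:+ExplicitGCInvokesConcurrent -jar yourapp.jar
```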
Finally, it is unclear what you mean by "optimizing object creation". Do you mean reducing the object creation rate? Or are you thinking of something else to manage the retention of long-lived (cached?) objects?
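If "optimizing object creation" means reducing the allocation rate, one common pattern is reusing a buffer rather than allocating a fresh one on every call, so fewer short-lived objects are created and fewer survivors get promoted to the old generation. A minimal sketch; the class and method names are made up for illustration:

```java
// Hypothetical example: reuse one StringBuilder across calls instead of
// creating a new one per record. This assumes single-threaded use; for
// multi-threaded code, wrap the buffer in a ThreadLocal.
public class ReuseDemo {
    private static final StringBuilder BUFFER = new StringBuilder(64);

    static String formatRecord(int id, String name) {
        BUFFER.setLength(0);            // reset instead of reallocating
        BUFFER.append(id).append(':').append(name);
        return BUFFER.toString();       // only the result String is allocated
    }

    public static void main(String[] args) {
        if (!formatRecord(1, "alpha").equals("1:alpha")) throw new AssertionError();
        if (!formatRecord(42, "beta").equals("42:beta")) throw new AssertionError();
        System.out.println("ok");
    }
}
```

Whether this kind of change helps depends on where your allocations actually happen, which is what a memory profiler will tell you.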
The Elephant Tracks tool can profile object creation and death. With it, you may be able to figure out which kinds of objects survive longer than expected, and with the method history it may be possible to identify the exact objects.
The drawback is that a large heap means a very long profiling run and a very large profile file.