We're having an issue with a JVM which is using a lot more than the max -Xmx.
We're using -Xmx2048M and it's currently using 17GB of OS memory.
I realise that the JVM can use more than the max heap, but ours is using 15GB more, which seems crazy.
The top dump looks like this:
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
30477 root 20 0 22.683g 0.017t 18536 S 0.7 29.1 27:43.44 java -Xmx2048M -Xms1024M -XX:MaxPermSize=256M -XX:ReservedCodeCacheSize=128M ....
(Note it's using 0.017TB or 17GB).
I've tried using jmap to debug:
sudo jmap -J-d64 -heap 30477
Attaching to process ID 30477, please wait...
Debugger attached successfully.
Server compiler detected.
JVM version is 24.75-b04
using thread-local object allocation.
Parallel GC with 8 thread(s)
Heap Configuration:
MinHeapFreeRatio = 0
MaxHeapFreeRatio = 100
MaxHeapSize = 2147483648 (2048.0MB)
NewSize = 1310720 (1.25MB)
MaxNewSize = 17592186044415 MB
OldSize = 5439488 (5.1875MB)
NewRatio = 2
SurvivorRatio = 8
PermSize = 21757952 (20.75MB)
MaxPermSize = 268435456 (256.0MB)
G1HeapRegionSize = 0 (0.0MB)
Heap Usage:
PS Young Generation
Eden Space:
capacity = 316145664 (301.5MB)
used = 150828136 (143.8409194946289MB)
free = 165317528 (157.6590805053711MB)
47.708431009827166% used
From Space:
capacity = 20447232 (19.5MB)
used = 3406624 (3.248809814453125MB)
free = 17040608 (16.251190185546875MB)
16.660563151041668% used
To Space:
capacity = 21495808 (20.5MB)
used = 0 (0.0MB)
free = 21495808 (20.5MB)
0.0% used
PS Old Generation
capacity = 716177408 (683.0MB)
used = 516302952 (492.3848648071289MB)
free = 199874456 (190.6151351928711MB)
72.09148825873044% used
PS Perm Generation
capacity = 56098816 (53.5MB)
used = 55842840 (53.255882263183594MB)
free = 255976 (0.24411773681640625MB)
99.54370516482915% used
15065 interned Strings occupying 1516808 bytes.
And also...
sudo jmap -J-d64 -permstat 30477
Attaching to process ID 30477, please wait...
Debugger attached successfully.
Server compiler detected.
JVM version is 24.75-b04
finding class loader instances ..done.
computing per loader stat ..done.
please wait.. computing liveness...liveness analysis may be inaccurate ...
class_loader classes bytes parent_loader alive? type
<bootstrap> 1908 11793728 null live <internal>
0x0000000780b97590 1 1880 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b99b28 1 1880 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b96cb0 1 1880 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780113ca0 1 1880 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b97c90 1 3048 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780114458 1 3048 null dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b892b0 1 3024 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b989a8 1 1880 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780019cf0 7731 44085000 0x0000000780019d40 live sun/misc/Launcher$AppClassLoader@0x0000000770239a20
0x0000000780113f10 1 3152 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780aa8918 1 1880 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780114008 1 3040 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b7a8b8 1 3024 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780d236d0 1 3048 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780019d40 207 1327296 null live sun/misc/Launcher$ExtClassLoader@0x00000007701eb8b8
0x0000000780113e28 1 1880 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x00000007801140b0 1 3064 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x000000078dc814f8 1 3024 null dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b8abf8 1 3208 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b990a8 1 3048 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780114130 1 1904 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b98010 1 1880 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x00000007801141b0 1 1880 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780113620 27 86320 0x0000000780019cf0 live org/apache/tapestry5/internal/plastic/PlasticClassLoader@0x000000077181b058
0x0000000780114418 1 3064 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b98d28 1 3048 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b89270 1 3024 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780113ed0 1 3024 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b97910 1 1880 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b997a8 1 1880 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b96930 1 3048 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780d36d38 1 3024 0x0000000780019d40 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b99ea8 1 1880 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780bddff8 1 1880 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780a63ca0 1 3024 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780114048 1 3048 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780113fc8 1 3024 null dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780113de8 1 3072 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b7afb8 1 3040 null dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780113e68 1 1904 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b7ac38 1 1880 null dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780113d58 1 3064 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x00000007801140f0 1 3280 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b98628 1 3048 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x000000078088f900 0 0 0x0000000780019cf0 dead java/util/ResourceBundle$RBClassLoader@0x00000007709cda40
0x0000000780b99428 1 1880 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780d28ba0 1 3024 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b97030 1 3048 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780114170 1 3064 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x00000007801143d8 1 3048 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780114240 1 1880 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
0x0000000780b8af60 1 3104 0x0000000780019cf0 dead sun/reflect/DelegatingClassLoader@0x0000000770059770
total = 53 9921 57418080 N/A alive=4, dead=49 N/A
The app is a Play app which I know does weird things with classloaders, but I think the -permstat should account for that, so I'm stumped.
Any ideas?
John
The JVM steadily increases heap usage until the garbage collector frees up memory again. High heap usage occurs when garbage collection cannot keep up; an indicator of this is when GC is unable to bring heap usage back down to around 30%.
But in 99% of cases this is completely normal behaviour by the JVM. What you have specified via the -Xmx switch limits only the memory consumed by your application heap; besides that, the JVM itself also needs some elbow room.
An interesting area when sizing the JVM is the JIT's code cache. By default, the HotSpot JVM will use up to 240MB. If the code cache is too small, the JIT will run out of space to store its output and performance will suffer as a result. If the cache is too large, memory may be wasted.
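If you want to see how big the code cache actually gets at runtime, the standard MemoryPoolMXBean API exposes it as a non-heap pool. A minimal sketch, assuming HotSpot's usual "Code Cache" pool name:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;

public class CodeCachePeek {
    public static void main(String[] args) {
        // On HotSpot 7/8 the JIT output lives in a non-heap memory pool,
        // normally reported under the name "Code Cache" (an assumption about naming).
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            if (pool.getName().contains("Code")) {
                System.out.printf("%s: used=%,d bytes, max=%,d bytes%n",
                        pool.getName(), pool.getUsage().getUsed(), pool.getUsage().getMax());
            }
        }
    }
}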
For 64-bit platforms and Java stacks in general, the recommended maximum heap range for WebSphere Application Server would be between 4096M and 8192M (4G to 8G).
Note that the heap (bounded by -Xmx) is only a part of the JVM process.
Basically, the memory occupied by the JVM is the sum of:
- the Java heap (-Xmx)
- PermGen (-XX:MaxPermSize)
- thread stacks (roughly 1MB per thread by default, -Xss)
- the JIT code cache (-XX:ReservedCodeCacheSize)
- GC internal structures
- native memory: NIO direct buffers, memory-mapped files, JNI allocations
And possibly something more; you can find it by googling :)
So you have to check all of those areas to find out what is consuming the memory.
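As a first sanity check from inside the process, MemoryMXBean splits usage into heap and non-heap (the latter covers PermGen and the code cache, but not thread stacks or native allocations). A minimal sketch:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;

public class MemorySnapshot {
    public static void main(String[] args) {
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        // Heap is what -Xmx bounds; non-heap covers PermGen and the code cache,
        // but NOT thread stacks, direct buffers, or other native allocations.
        System.out.println("Heap:     " + mem.getHeapMemoryUsage());
        System.out.println("Non-heap: " + mem.getNonHeapMemoryUsage());
    }
}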
I bet you have a simple thread leak - thousands of threads can eat up a lot of memory (as I said, assuming ~1MB per thread, 10,000 threads corresponds to about 10GB of memory). You could simply take a thread dump (jstack -l <pid> > dumpfile.txt) to see if that is the issue.
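If you'd rather check from inside the process than read through a jstack dump, ThreadMXBean reports the live thread count directly. A minimal sketch:

import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class ThreadCount {
    public static void main(String[] args) {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        // Thousands of live threads means thread stacks alone (~1MB each by default)
        // can account for gigabytes of memory outside the heap.
        System.out.println("Live threads: " + threads.getThreadCount());
        System.out.println("Peak threads: " + threads.getPeakThreadCount());
    }
}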
Of course it could also be an external library that uses native buffers or JNI, but that is less probable.
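Direct NIO buffers at least are easy to rule in or out: the BufferPoolMXBean added in Java 7 reports how much memory they hold outside the heap. A minimal sketch:

import java.lang.management.BufferPoolMXBean;
import java.lang.management.ManagementFactory;

public class BufferPools {
    public static void main(String[] args) {
        // The "direct" and "mapped" pools track NIO buffer memory that lives
        // outside the Java heap and is invisible to jmap -heap.
        for (BufferPoolMXBean pool : ManagementFactory.getPlatformMXBeans(BufferPoolMXBean.class)) {
            System.out.printf("%s: count=%d, memoryUsed=%,d bytes%n",
                    pool.getName(), pool.getCount(), pool.getMemoryUsed());
        }
    }
}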
So we figured this out, and it was just developer error. Thanks to everyone who offered suggestions.
I'm answering this anyway in case it helps anyone in the future.
The issue was simply that we were not closing a stream opened on a resource read using Class.getResourceAsStream().
The confusion arose because the JVM heap usage was negligible, but each unclosed stream somehow left some native memory allocated and never released.
Because there was no file handle leak (presumably because we were reading from the main JAR each time, the JVM was re-using the same handle), and because the heap usage was tiny, we didn't spot it until we actually took a heap dump.
Anyway, the moral of the story: don't make the mistake I did of assuming that if JVM heap usage is low there is no point looking at a heap dump. In this case, seeing the thousands of objects still allocated gave us the clue.
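For anyone hitting the same thing, the fix boiled down to closing the stream deterministically with try-with-resources. A minimal sketch; the helper and resource handling here are illustrative, not our actual code:

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ResourceReader {
    // Hypothetical helper, not our actual code.
    static byte[] readResource(Class<?> owner, String name) throws IOException {
        // try-with-resources guarantees the stream (and whatever native memory
        // sits behind it) is released even if the read throws.
        try (InputStream in = owner.getResourceAsStream(name)) {
            if (in == null) {
                throw new IOException("Resource not found: " + name);
            }
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return out.toByteArray();
        }
    }
}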
Thanks all!