I found that one of my Spring Boot projects' memory (RAM) consumption is increasing day by day. When I uploaded the jar file to the AWS server, it was taking 582 MB of RAM (max allocated RAM is 1500 MB), but the usage has been growing by 50-100 MB each day, and today, after 5 days, it's taking 835 MB. Right now the project has 100-150 users with normal usage of the REST APIs.
Because of this increase in RAM, the application has gone down a couple of times with the following error (found in the logs):
Exception in thread "http-nio-3384-ClientPoller" java.lang.OutOfMemoryError: Java heap space
So to resolve this, I found that by using a Java heap dump I can find the objects/classes that are taking up the memory. I created a heap dump with jmap on the command line (a typical command is shown after the list below) and uploaded it to Heap Hero and the Eclipse Memory Analyzer Tool. In both of them I found the following:
1. Total wasted memory is 64.69 MB (73%) (see screenshot below).
2. Of that, 34.06 MB is taken by byte[] arrays and LinkedHashMap instances (see screenshot below), which I have never used anywhere in my project. I searched my project for them but didn't find anything.
3. The following two large objects are taking 32 MB and 20 MB respectively:
   1. Java static io.netty.buffer.ByteBufUtil.DEFAULT_ALLOCATOR
   2. Java static com.mysql.cj.jdbc.AbandonedConnectionCleanupThread.connectionFinalizerPhantomRefs
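For reference, a heap dump is typically created with jmap like this (the PID and output path are placeholders to substitute with your own):

    # Find the PID of the running Spring Boot process
    jps -l

    # Capture a heap dump of live objects only (forces a full GC first)
    jmap -dump:live,format=b,file=/tmp/heapdump.hprof <PID>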
So I tried to find this netty.buffer in my project, but I couldn't find anything that matched netty or buffer.
Now my question is: how can I fix this memory leak, or how can I find the exact objects/classes/variables consuming the memory, so that I can reduce the heap usage?
I know a few of the experts will ask for the source code or something similar, but I believe that from the heap dump we can find the memory leak or the live objects that are in memory. I am looking for that option, or anything else that reduces this heap usage!
I have been working on this issue for the past 3 weeks. Any help would be appreciated. Thank you!
If you are trying to test for memory leakage in your application, all you have to do is identify the objects that accumulate in the heap every time you use the application. To identify those objects effectively, first take an initial heap dump, then perform some actions in your application, and take a second dump to compare against the first (see the sketch below).
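As a sketch, the two snapshots can be taken with jcmd (the PID and paths are placeholders; jmap works equally well):

    # Snapshot 1: before exercising the application
    jcmd <PID> GC.heap_dump /tmp/before.hprof

    # ... exercise the suspect REST endpoints ...

    # Snapshot 2: after exercising the application
    jcmd <PID> GC.heap_dump /tmp/after.hprof

Both files can then be loaded into Eclipse MAT and compared with its histogram diff to see which object counts grew between the snapshots.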
If you have a heap dump file saved on your local system, you can open it in Java VisualVM by choosing File > Load from the main menu. Java VisualVM can open heap dumps saved in the .hprof file format. When you open a saved heap dump, it opens as a tab in the main window.
To open the heap dump in Eclipse Memory Analyzer, use File --> Open Heap Dump. It will first prompt you to create a leak suspects report, which you can create or skip. The Overview tab of the Memory Analyzer shows the total size of the heap and a pie chart of object sizes.
What is the heap dump file size? In this case, the heap dump file is 382 MiB.
Start by enabling the JVM Native Memory Tracking to get an idea of which part of the memory is increasing, by adding the flag -XX:NativeMemoryTracking=summary. There is some performance overhead according to the documentation (5-10%), but if this isn't an issue I would recommend running the JVM with this flag enabled even in production. You can then check the values using jcmd <PID> VM.native_memory (there's a good writeup in this answer: Java native memory usage).
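A minimal sketch of that workflow (the jar name and PID are placeholders):

    # Start the application with native memory tracking enabled
    java -XX:NativeMemoryTracking=summary -jar app.jar

    # Record a baseline once the application is warmed up
    jcmd <PID> VM.native_memory baseline

    # Later, print what has grown relative to the baseline
    jcmd <PID> VM.native_memory summary.diff

The summary.diff output breaks the growth down by category (Java Heap, Thread, Internal, and so on), which tells you whether the growth is in the heap or in native memory.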
If there is indeed a big chunk of native memory allocated, it's likely this is allocated by Netty.
How do you run your application in AWS? If it's running in a Docker image, you might have stumbled upon this issue: What would cause a java process to greatly exceed the Xmx or Xss limit? If this is the case, you may need to set the environment variable MALLOC_ARENA_MAX, since your application is using native memory (which Netty does) and may be running on a server with a large number of cores. It's perfectly possible that the JVM allocates this memory for Netty but doesn't see any reason to release it, so it appears to only keep growing.
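As a sketch, assuming a Docker deployment, the variable can be set in the Dockerfile (the value 2 is a common starting point, not a universal recommendation):

    # Limit the number of glibc malloc arenas to curb native memory growth
    ENV MALLOC_ARENA_MAX=2

Outside Docker, the same can be done with export MALLOC_ARENA_MAX=2 in the shell or service definition that starts the JVM.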
If you want to control how much native memory can be allocated by Netty, you can use the JVM flag -XX:MaxDirectMemorySize for this (I believe the default is the same value as -Xmx) and lower it in case your application doesn't require that much direct memory.
JVM memory tuning is a complex process, and it becomes even more complex when native memory is involved: as the linked answer shows, it's not as easy as simply setting the -Xms and -Xmx flags and expecting that no more memory will be used.
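For example, a launch command combining these flags might look like this (the sizes are illustrative placeholders, not recommendations for this particular application):

    # Cap the heap at 512 MB and direct (off-heap) buffers at 128 MB
    java -Xms512m -Xmx512m -XX:MaxDirectMemorySize=128m -jar app.jar

Note that the total RSS of the process will still exceed -Xmx, since thread stacks, metaspace, code cache, and native allocations come on top of the heap.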
A heap dump alone is not enough to detect memory leaks. You need to look at the difference between two consecutive heap snapshots, both taken after calling the GC. Alternatively, you need a profiling tool that can give the generation count for each class. Then you should only look at your domain objects (not generic objects like bytes or strings, etc.) that survived the GC and passed from the old snapshot to the new one. Or, if using a profiling tool, look for old domain objects that are still alive and growing across many generations.
Having objects that live for many generations and keep growing means those objects are still referenced and the GC is not able to reclaim them. However, living for many generations alone is not enough to indicate a leak, because cached or static objects may stay for many generations. The other important factor is that they keep growing.
Once you have detected which objects are being leaked, you can use the heap dump to analyse those objects and find what is referencing them.
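To illustrate the pattern described above, here is a minimal, hypothetical Java example of a leak that would show up exactly this way: a domain object accumulating in a static collection, surviving every GC, and growing between snapshots.

    import java.util.ArrayList;
    import java.util.List;

    public class LeakExample {
        // Static collection: entries stay reachable forever, so the GC can never reclaim them
        private static final List<byte[]> CACHE = new ArrayList<>();

        // Called on every request; each call pins another 1 MB in the heap
        static void handleRequest() {
            CACHE.add(new byte[1024 * 1024]);
        }

        public static void main(String[] args) throws InterruptedException {
            while (true) {
                handleRequest();
                Thread.sleep(100); // heap grows steadily, much like the 50-100 MB/day pattern
            }
        }
    }

In two consecutive snapshots, the count of byte[] objects referenced from LeakExample.CACHE grows while everything else stays flat; that is the signature to look for.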