
Burst memory usage in Java


I am trying to get a handle on proper memory usage and garbage collection in Java. I'm not a novice programmer by any means, but it always seems to me that once Java touches some memory, it will never be released for other applications to use. In that case, you have to make sure your peak memory is never too high, or your application will continually use whatever the peak memory usage was.

I wrote a small sample program trying to demonstrate this. It basically has 4 buttons...

  1. Fill a class-scope variable BigList = new ArrayList<String>() with about 25,000,000 long string items.
  2. Call BigList.clear()
  3. Reallocate the list - BigList = new ArrayList<String>() again (to shrink the list size)
  4. A call to System.gc() - Yes, I know this doesn't mean that GC will really run, but it's what we have. (See the sketch after this list.)
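
A minimal sketch of those four steps as plain method calls (no buttons; the exact string contents are arbitrary, and you would need a large -Xmx to actually hold 25,000,000 of them):

    import java.util.ArrayList;
    import java.util.List;

    public class BurstMemoryDemo {
        // Class-scope list, named as in the description above.
        static List<String> BigList = new ArrayList<String>();

        static void fill() {
            for (int i = 0; i < 25000000; i++) {
                BigList.add("some fairly long string value number " + i);
            }
        }

        static void clearList() {
            BigList.clear();                     // empties the list but keeps its backing array
        }

        static void reallocate() {
            BigList = new ArrayList<String>();   // drops the old backing array entirely
        }

        static void requestGc() {
            System.gc();                         // only a hint; the JVM may ignore it
        }

        public static void main(String[] args) {
            fill();
            clearList();
            reallocate();
            requestGc();
        }
    }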

So next I did some testing on Windows, Linux, and Mac OS, using each platform's default task monitor to check the process's reported memory usage. Here is what I found...

  • Windows - Pumping the list, calling clear, and then calling GC several times will not reduce memory usage at all. However, reallocating the list using new and then calling GC several times will reduce the memory usage back to starting levels. IMO, this is acceptable.
  • Linux (I used Mint 11 distro with Sun JVM) - Same results as Windows.
  • Mac OS - I followed the same steps as above, but even after reinitializing the list, calls to GC seemingly have no effect. The program will sit using hundreds of MB of RAM even though I have nothing in memory.

Can anyone explain this to me? Some people have told me some stuff about "heap" memory, but I still don't fully understand it and I'm not sure it applies here. From what I have heard about it, I shouldn't be seeing the behavior I am on Windows and Linux anyway.

Is this just a difference in the way Mac OS's Activity Monitor measures memory usage or is there something else going on? I would prefer to not have my program idling with tons of RAM usage. Thanks for your insight.

jocull asked Nov 25 '11



2 Answers

The Sun/Oracle JVM does not return unneeded memory to the system. If you give it a large, maximum heap size, and you actually use that heap space at some point, the JVM won't give it back to the OS for other uses. Other JVMs will do that (JRockit used to, but I don't think it does any more).

So, for Oracle's JVM you need to tune your app and your system for peak usage; that's just how it works. If the memory that you're using can be managed with byte arrays (such as working with images or something), then you can use mapped byte buffers instead of Java byte arrays. Mapped byte buffers are taken straight from the system, and are not part of the heap. When you free up these objects (and they are GC'd, I believe, but I'm not sure), the memory will be returned to the system. You'll likely have to play with that one, assuming it's even applicable at all.
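
A minimal sketch of that idea: memory can be taken off-heap either with a direct buffer or with a file-backed mapped buffer via FileChannel.map (the file name demo.bin and the 64 MB size below are arbitrary choices for illustration):

    import java.io.RandomAccessFile;
    import java.nio.ByteBuffer;
    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;

    public class OffHeapBufferSketch {
        public static void main(String[] args) throws Exception {
            // A direct buffer lives outside the Java heap; its memory is released
            // once the buffer object itself has been garbage collected.
            ByteBuffer direct = ByteBuffer.allocateDirect(64 * 1024 * 1024); // 64 MB off-heap
            direct.putLong(0, 42L);

            // A mapped buffer is backed by a file region mapped in by the OS,
            // so the data never occupies Java heap space at all.
            try (RandomAccessFile file = new RandomAccessFile("demo.bin", "rw");
                 FileChannel channel = file.getChannel()) {
                MappedByteBuffer mapped =
                        channel.map(FileChannel.MapMode.READ_WRITE, 0, 64L * 1024 * 1024);
                mapped.putLong(0, 42L);
                // The mapping goes away after the MappedByteBuffer is collected
                // (or the process exits), and the pages are returned to the OS.
            }
        }
    }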

Will Hartung answered Oct 22 '22


... but it always seems to me that once Java touches some memory, it's gone forever. You will never get it back.

It depends on what you mean by "gone forever".

I've also heard it said that some JVMs do give memory back to the OS when they are ready and able to. Unfortunately, given the way that the low-level memory APIs typically work, the JVM has to give back entire segments, and it tends to be complicated to "evacuate" a segment so that it can be given back.

But I wouldn't rely on that ... because there are various things that could prevent the memory being given back. The chances are that the JVM won't give the memory back to the OS. But it is not "gone forever" in the sense that the JVM will continue to make use of it. Even if the JVM never approaches the peak usage again, all of that memory will help to make the garbage collector run more efficiently.

In that case, you have to make sure your peak memory is never too high, or your application will continually eat up hundreds of MB of RAM.

That is not true. Assuming that you are adopting the strategy of starting with a small heap and letting it grow, the JVM won't ask for significantly more memory than the peak memory. The JVM won't continually eat up more memory ... unless your application has a memory leak and (as a result) its peak memory requirement has no bound.

(The OP's comments below indicate that this is not what he was trying to say. Even so, it is what he did say.)
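
To watch the "start small and grow" behaviour from inside the JVM rather than from an OS task monitor, the standard Runtime API reports the committed and maximum heap sizes. A rough sketch (the 64 MB of throw-away allocations is just an arbitrary way to force the heap to grow):

    public class HeapGrowthSketch {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            System.out.printf("before: committed=%d MB, max=%d MB%n",
                    rt.totalMemory() >> 20, rt.maxMemory() >> 20);

            // Allocate some data so the committed heap has to grow.
            byte[][] chunks = new byte[64][];
            for (int i = 0; i < chunks.length; i++) {
                chunks[i] = new byte[1 << 20];   // 1 MB each
            }

            // totalMemory() (the committed heap) grows toward the peak demand,
            // but it is bounded by maxMemory() and does not keep climbing once
            // the peak has been reached.
            System.out.printf("after:  committed=%d MB, max=%d MB%n",
                    rt.totalMemory() >> 20, rt.maxMemory() >> 20);
        }
    }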


On the topic of garbage collection efficiency, we can model the cost of a run of an efficient garbage collector as:

cost ~= (amount_of_live_data * W1) + (amount_of_garbage * W2) 

where W1 and W2 are (we assume) constants that depend on the collector. (Actually, this is an over-simplification. The first part is not a linear function of the number of live objects. However, I claim that it doesn't matter for the following.)

The efficiency of the collector can then be stated as:

efficiency = cost / amount_of_garbage_collected 

which (if we assume that the GC collects all data) expands to

efficiency ~= (amount_of_live_data * W1) / amount_of_garbage + W2. 

When the GC runs,

heap_size ~= amount_of_live_data + amount_of_garbage 

so

efficiency ~= W1 * (amount_of_live_data / (heap_size - amount_of_live_data)) + W2.

In other words:

  • as you increase the heap size, the efficiency tends to a constant (W2), but
  • you need a large ratio of heap_size to amount_of_live_data for this to happen.
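
For example, with purely illustrative numbers, holding amount_of_live_data at 100 MB:

heap_size = 200 MB:   efficiency ~= W1 * (100 / 100) + W2  =  1.00 * W1 + W2
heap_size = 500 MB:   efficiency ~= W1 * (100 / 400) + W2  =  0.25 * W1 + W2
heap_size = 1000 MB:  efficiency ~= W1 * (100 / 900) + W2 ~=  0.11 * W1 + W2

So each increase in heap size still lowers the per-unit cost of collection, but the gain shrinks as the W2 term starts to dominate.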

The other point is that for an efficient copying collector, W2 covers just the cost of zeroing the space occupied by the garbage objects in 'from space'. The rest (tracing, copying of live objects to 'to space', and zeroing the 'from space' that they occupied) is part of the first term of the initial equation; i.e. covered by W1. What this means is that W2 is likely to be considerably smaller than W1 ... and that the first term of the final equation is significant for longer.

Now obviously this is a theoretical analysis, and the cost model is a simplification of how real garbage collectors really work. (And it doesn't take account of the "real" work that the application is doing, or the system-level effects of tying down too much memory.) However, the maths tells me that from the standpoint of GC efficiency, a big heap really does help a lot.

Stephen C answered Oct 22 '22