Why does a 64-bit JVM throw Out Of Memory before -Xmx is reached?

I am wrestling with large memory requirements for a Java app.

To address more memory I have switched to a 64-bit JVM and am using a large -Xmx. However, when -Xmx is above 2GB the app seems to run out of memory earlier than expected. When running with -Xmx2400m and looking at GC info from -verbose:gc I get...

[Full GC 2058514K->2058429K(2065024K), 0.6449874 secs] 

...and then it throws an out of memory exception. I would expect it to increase the heap above 2065024K before running out of memory.

As a trivial example, I have a test program that allocates memory in a loop and prints information from Runtime.getRuntime().maxMemory() and Runtime.getRuntime().totalMemory() until it eventually runs out of memory.

Running this over a range of -Xmx values, it appears that Runtime.getRuntime().maxMemory() reports about 10% less than -Xmx, and that total memory will not grow beyond 90% of Runtime.getRuntime().maxMemory().

I am using the following 64-bit JVM:

java version "1.6.0_26"
Java(TM) SE Runtime Environment (build 1.6.0_26-b03)
Java HotSpot(TM) 64-Bit Server VM (build 20.1-b02, mixed mode)

Here is the code:

import java.util.ArrayList;

public class XmxTester {

    private static String xmxStr;

    private long maxMem;
    private long usedMem;
    private long totalMemAllocated;
    private long freeMem;

    // Hold references so allocated blocks cannot be garbage collected.
    private final ArrayList<byte[]> list = new ArrayList<byte[]>();

    public static void main(String[] args) {
        xmxStr = args[0];
        new XmxTester();
    }

    public XmxTester() {
        // Allocate until the heap is exhausted, sampling the Runtime
        // memory figures before each allocation.
        while (true) {
            printMemory();
            eatMemory();
        }
    }

    private void eatMemory() {
        byte[] mem = null;
        try {
            mem = new byte[1024 * 1024];
        } catch (Throwable e) {
            // Out of memory: print one CSV line and exit.
            System.out.println(xmxStr + "," + convertMB(maxMem) + ","
                + convertMB(totalMemAllocated) + "," + convertMB(usedMem)
                + "," + convertMB(freeMem));
            System.exit(0);
        }
        list.add(mem);
    }

    private void printMemory() {
        maxMem = Runtime.getRuntime().maxMemory();
        freeMem = Runtime.getRuntime().freeMemory();
        totalMemAllocated = Runtime.getRuntime().totalMemory();
        usedMem = totalMemAllocated - freeMem;
    }

    private double convertMB(long bytes) {
        final int CONVERSION_VALUE = 1024;
        return Math.round(bytes / Math.pow(CONVERSION_VALUE, 2));
    }
}

I use this batch file to run the test over multiple -Xmx settings. It includes a call to a 32-bit JVM because I wanted a comparison; obviously that call fails as soon as -Xmx is larger than about 1500M.

@echo off
set java64=<location of 64bit JVM>
set java32=<location of 32bit JVM>
set xmxval=64

:start
SET /a xmxval = %xmxval% + 64

%java64% -Xmx%xmxval%m -XX:+UseCompressedOops -XX:+DisableExplicitGC XmxTester %xmxval%
%java32% -Xms28m -Xmx%xmxval%m XmxTester %xmxval%

REM 4500 is not a multiple of 64 and would never be hit exactly, so
REM compare with GEQ instead of == to avoid an infinite loop.
if %xmxval% GEQ 4500 goto end
goto start
:end
pause

This spits out a CSV which, when pulled into Excel, looks like this (apologies for my poor formatting here; values are in MB):

32 bit

XMX  max mem  total mem  used mem  free mem  % of xmx used before OOM
128  127  127  125  2  98.4%
192  191  191  189  1  99.0%
256  254  254  252  2  99.2%
320  318  318  316  1  99.4%
384  381  381  379  2  99.5%
448  445  445  443  1  99.6%
512  508  508  506  2  99.6%
576  572  572  570  1  99.7%
640  635  635  633  2  99.7%
704  699  699  697  1  99.7%
768  762  762  760  2  99.7%
832  826  826  824  1  99.8%
896  889  889  887  2  99.8%
960  953  953  952  0  99.9%
1024  1016  1016  1014  2  99.8%
1088  1080  1080  1079  1  99.9%
1152  1143  1143  1141  2  99.8%
1216  1207  1207  1205  2  99.8%
1280  1270  1270  1268  2  99.8%
1344  1334  1334  1332  2  99.9%

64 bit

XMX  max mem  total mem  used mem  free mem  % of xmx used before OOM

128  122  122  116  6  90.6%
192  187  187  180  6  93.8%
256  238  238  232  6  90.6%
320  285  281  275  6  85.9%
384  365  365  359  6  93.5%
448  409  409  402  6  89.7%
512  455  451  445  6  86.9%
576  512  496  489  7  84.9%
640  595  595  565  30  88.3%
704  659  659  629  30  89.3%
768  683  682  676  6  88.0%
832  740  728  722  6  86.8%
896  797  772  766  6  85.5%
960  853  832  825  6  85.9%
1024  910  867  860  7  84.0%
1088  967  916  909  6  83.5%
1152  1060  1060  1013  47  87.9%
1216  1115  1115  1068  47  87.8%
1280  1143  1143  1137  6  88.8%
1344  1195  1174  1167  7  86.8%
1408  1252  1226  1220  6  86.6%
1472  1309  1265  1259  6  85.5%
1536  1365  1317  1261  56  82.1%
1600  1422  1325  1318  7  82.4%
1664  1479  1392  1386  6  83.3%
1728  1536  1422  1415  7  81.9%
1792  1593  1455  1448  6  80.8%
1856  1650  1579  1573  6  84.8%
1920  1707  1565  1558  7  81.1%
1984  1764  1715  1649  66  83.1%
2048  1821  1773  1708  65  83.4%
2112  1877  1776  1769  7  83.8%
2176  1934  1842  1776  66  81.6%
2240  1991  1899  1833  65  81.8%
2304  2048  1876  1870  6  81.2%
2368  2105  1961  1955  6  82.6%
2432  2162  2006  2000  6  82.2%
Asked Oct 25 '11 by Al Quinn


1 Answer

Why does it happen?

Basically, there are two strategies that the JVM / GC can use to decide when to give up and throw an OOME.

  • It can keep going and going until there is simply not enough memory after garbage collection to allocate the next object.

  • It can keep going until the JVM is spending more than a given percentage of time running the garbage collector.

The first approach has the problem that for a typical application the JVM will spend a larger and larger percentage of its time running the GC, in an ultimately futile effort to complete the task.

The second approach has the problem that it might give up too soon.


The actual behaviour of the GC in this area is governed by JVM options (-XX:...). Apparently, the default behaviour differs between 32 and 64 bit JVMs. This kind of makes sense, because (intuitively) the "out of memory death spiral" effect for a 64 bit JVM will last longer and be more pronounced.
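For HotSpot's throughput collector, the relevant options are the "GC overhead limit" flags. A sketch (the defaults shown are as documented for Java 6; verify them against your JVM build):

```shell
# The JVM throws an OOME when more than GCTimeLimit percent of total time
# is spent in GC while less than GCHeapFreeLimit percent of the heap is
# being recovered. The values below are the documented defaults.
java -Xmx2400m \
     -XX:+UseGCOverheadLimit \
     -XX:GCTimeLimit=98 \
     -XX:GCHeapFreeLimit=2 \
     XmxTester 2400

# Turning the limit off reverts to strategy one: keep going until an
# allocation simply cannot be satisfied.
# java -Xmx2400m -XX:-UseGCOverheadLimit XmxTester 2400
```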


My advice would be to leave this issue alone. Unless you really need to fill every last byte of memory with stuff, it is better for the JVM to die early and avoid wasting lots of time. You can then restart it with more memory and get the job done.

Clearly, your benchmark is atypical. Most real programs simply don't try to grab all of the heap. It is possible that your application is atypical too. But it is also possible that your application is suffering from a memory leak. If that is the case, you should be investigating the leak rather than trying to figure out why you can't use all of memory.


> However, my issue is mainly with why it does not honor my -Xmx setting.

It is honoring it! -Xmx is the upper limit on the heap size, not the criterion for deciding when to give up.

> I have set an -Xmx of 2432M, but asking the JVM to return its understanding of max memory returns 2162M.

It is returning the maximum amount of memory that the JVM will attempt to use, not the raw -Xmx value. On HotSpot's generational collectors that figure excludes one of the two survivor spaces, because only one of them is usable for live objects at any given time.

> Why does it 'think' the max memory is 11% less than the -Xmx?

See above.
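To see the gap directly, compare Runtime.maxMemory() against the -Xmx value you passed. A minimal sketch (the class name is invented for this illustration):

```java
public class MaxMemoryCheck {
    public static void main(String[] args) {
        // maxMemory() reports the maximum amount of memory the JVM will
        // attempt to use; on a generational HotSpot collector this is
        // typically somewhat below -Xmx (one survivor space is excluded).
        long maxMemMB = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Runtime.maxMemory() = " + maxMemMB + " MB");
    }
}
```

Running this with, say, `java -Xmx2432m MaxMemoryCheck` and comparing the printed figure with 2432 shows the shortfall described in the question.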

> Furthermore, why, when the heap hits 2006M, does it not extend the heap to at least 2162M?

I presume that it is because the JVM has hit the "too much time spent garbage collecting" threshold.

> Does this mean that in 64-bit JVMs one should fudge the -Xmx setting to be 11% higher than the intended maximum?

Not in general. The fudge factor depends on your application. For instance, an application with a larger rate of object churn (i.e. more objects created and discarded per unit of useful work) is likely to die with an OOME sooner.

> I can predict the requirements based on DB size and have a wrapper that adjusts -Xmx. However, I have the 11% problem whereby my monitoring suggests the app needs 2GB, so I set a 2.4GB -Xmx. Instead of having the expected 400MB of headroom, the JVM only allows the heap to grow to 2006M.

IMO, the solution is to simply add an extra 20% (or more) on top of what you are currently adding. Assuming that you have enough physical memory, giving the JVM a larger heap is going to reduce overall GC overheads and make your application run faster.
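For a wrapper that computes -Xmx from a predicted requirement, that advice amounts to padding the measured need before passing it to the JVM. A hypothetical helper (not code from the post):

```java
public class XmxPadding {
    // Pad the measured peak heap requirement by a safety factor before
    // using it as -Xmx, to cover both the maxMemory() shortfall and
    // general GC headroom.
    static long paddedXmxMB(long measuredPeakMB, double safetyFactor) {
        return (long) Math.ceil(measuredPeakMB * safetyFactor);
    }

    public static void main(String[] args) {
        // A measured need of 2000 MB with a 20% margin -> -Xmx2400m
        System.out.println("-Xmx" + paddedXmxMB(2000, 1.2) + "m");
    }
}
```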

Other tricks you could try are setting -Xmx and -Xms to the same value and adjusting the tuning parameter that sets the maximum "time spent garbage collecting" ratio.
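Concretely, that might look like the following (illustrative values; GCTimeLimit is the HotSpot parameter behind the "time spent garbage collecting" ratio):

```shell
# Fix the heap size up front (-Xms == -Xmx) so the JVM never has to grow
# it, and raise the GC-time threshold that triggers an early OOME.
java -Xms2400m -Xmx2400m -XX:GCTimeLimit=99 XmxTester 2400
```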

Answered Nov 15 '22 by Stephen C