My approach was to create a hundred thousand local collections and populate them with random strings, something like this:
SecureRandom random = new SecureRandom();
for (int i = 0; i < 100000; i++) {
    HashMap<String, String> map = new HashMap<String, String>();
    for (int j = 0; j < 30; j++) {
        map.put(new BigInteger(130, random).toString(32),
                new BigInteger(130, random).toString(32));
    }
}
I have also provided the -XX:+UseGCOverheadLimit JVM parameter, but I cannot get the error. Is there an easy and reliable way/hack to trigger this error?
The "java. lang. OutOfMemoryError: GC overhead limit exceeded" error indicates that the NameNode heap size is insufficient for the amount of HDFS data in the cluster. Increase the heap size to prevent out-of-memory exceptions.
Prevention: If -XX:MaxMetaspaceSize has been set on the command line, increase its value. Metaspace is allocated from the same address space as the Java heap, so reducing the size of the Java heap will make more space available for Metaspace.
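For instance (a hedged sketch: the sizes here are arbitrary placeholders and MyApp stands in for your own main class), raising the limit might look like this:
> java -Xmx1g -XX:MaxMetaspaceSize=512m MyApp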
To modify the number of threads, use the flag -XX:ParallelGCThreads=#, where # is the number of threads (in our case, eight). Once you've enabled those flags, test the application and see how much performance you've gained. Ideally, your application should now run faster and have shorter GC pause times.
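As a sketch (the HotSpot flags are real, but the thread count and the MyApp class name are placeholders for your own setup), a full invocation might be:
> java -XX:+UseParallelGC -XX:ParallelGCThreads=8 -verbose:gc MyApp
Adding -verbose:gc prints a line per collection, which makes it easy to compare pause times before and after the change.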
Since you haven't accepted any answer, I'll assume that none of them have worked for you. Here's one that will. But first, a review of the conditions that trigger this error:
The parallel collector will throw an OutOfMemoryError if too much time is being spent in garbage collection: if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, the OutOfMemoryError is thrown.
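To put numbers on that: with a 1024 MB heap, the limit trips when consecutive collections each reclaim less than about 20 MB (2% of 1024 MB) while GC itself consumes more than 98% of the program's total run time.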
So, you have to consume almost all of the heap, keep it allocated, and then allocate lots of garbage. Putting lots of stuff into a Map isn't going to do this for you.
import java.util.LinkedList;
import java.util.List;

public class GCOverheadTrigger
{
    public static void main(String[] argv)
    throws Exception
    {
        // first fill the heap with live data, then churn garbage
        List<Object> fixedData = consumeAvailableMemory();
        while (true)
        {
            // allocate and immediately discard; this keeps the collector busy
            Object data = new byte[64 * 1024 - 1];
        }
    }

    private static List<Object> consumeAvailableMemory()
    throws Exception
    {
        LinkedList<Object> holder = new LinkedList<Object>();
        while (true)
        {
            try
            {
                holder.add(new byte[128 * 1024]);
            }
            catch (OutOfMemoryError ex)
            {
                // free one block so the main loop has room to allocate
                holder.removeLast();
                return holder;
            }
        }
    }
}
The consumeAvailableMemory() method fills up the heap with relatively small chunks of memory. "Relatively small" is important because the JVM will put "large" objects (512 KB, in my experience) directly into the tenured generation, leaving the young generation empty.
After I've consumed most of the heap, I just allocate and discard. The smaller block size in this phase is important: I know that I'll have enough memory for at least one allocation, but probably not more than two. This will keep the GC active.
Running this produces the desired error in under a second:
> java -Xms1024m -Xmx1024m GCOverheadTrigger
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
at GCOverheadTrigger.main(GCOverheadTrigger.java:12)
And, for completeness, here's the JVM that I'm using:
> java -version
java version "1.6.0_45"
Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
And now my question for you: why in the world would you want to do this?
This:
HashMap<String, String> map = new HashMap<String, String>();
is scoped within the loop and there are no external (long-term) references to the map created as the loop iterates. Hence each map will be eligible for garbage collection at the end of each loop iteration.
You need to create a collection of objects outside the loop, and use the loop to populate that collection.
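For example, here is a minimal rework of the question's loop (the RetainedMaps class name is mine; note that retaining everything will most likely end in a plain "Java heap space" error rather than the overhead-limit variant, for the reasons given in the answer above):

import java.math.BigInteger;
import java.security.SecureRandom;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;

public class RetainedMaps
{
    public static void main(String[] args)
    {
        SecureRandom random = new SecureRandom();
        // long-lived collection: every map added here stays reachable
        List<HashMap<String, String>> allMaps = new ArrayList<HashMap<String, String>>();
        for (int i = 0; i < 100000; i++)
        {
            HashMap<String, String> map = new HashMap<String, String>();
            for (int j = 0; j < 30; j++)
            {
                map.put(new BigInteger(130, random).toString(32),
                        new BigInteger(130, random).toString(32));
            }
            allMaps.add(map); // without this line, each map dies at the end of its iteration
        }
        System.out.println("retained " + allMaps.size() + " maps");
    }
}

The only essential change is the allMaps.add(map) call: it keeps a live reference to every map, so none of them can be collected.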