Since you haven't accepted any answer, I'll assume that none of them have worked for you. Here's one that will. But first, a review of the conditions that trigger this error:
The parallel collector will throw an OutOfMemoryError if too much time is being spent in garbage collection: if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered.
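(As an aside: on the HotSpot JVMs I'm familiar with, those two thresholds correspond to the -XX:GCTimeLimit flag, which defaults to 98, and the -XX:GCHeapFreeLimit flag, which defaults to 2, and the whole check can be switched off with -XX:-UseGCOverheadLimit. Treat the invocation below as a sketch and check your own JVM's documentation; YourMainClass is just a placeholder.)

> java -XX:-UseGCOverheadLimit -Xms1024m -Xmx1024m YourMainClass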
So, you have to consume almost all of the heap, keep it allocated, and then allocate lots of garbage. Putting lots of stuff into a Map isn't going to do this for you.
import java.util.LinkedList;
import java.util.List;

public class GCOverheadTrigger
{
    public static void main(String[] argv)
    throws Exception
    {
        // hold a strong reference so the filler objects are never eligible for collection
        List<Object> fixedData = consumeAvailableMemory();

        // now allocate and immediately discard small blocks, forever
        while (true)
        {
            Object data = new byte[64 * 1024 - 1];
        }
    }

    private static List<Object> consumeAvailableMemory()
    throws Exception
    {
        LinkedList<Object> holder = new LinkedList<Object>();
        while (true)
        {
            try
            {
                holder.add(new byte[128 * 1024]);
            }
            catch (OutOfMemoryError ex)
            {
                // back off by one block to leave a little free space in the heap
                holder.removeLast();
                return holder;
            }
        }
    }
}
The consumeAvailableMemory() method fills up the heap with relatively small chunks of memory. "Relatively small" is important because the JVM will put "large" objects (512 KB in my experience) directly into the tenured generation, leaving the young generation empty.
After I've consumed most of the heap, I just allocate and discard. The smaller block size in this phase is important: I know that I'll have enough memory for at least one allocation, but probably not more than two. This will keep the GC active.
Running this produces the desired error in under a second:
> java -Xms1024m -Xmx1024m GCOverheadTrigger
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
at GCOverheadTrigger.main(GCOverheadTrigger.java:12)
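If you want to watch the collector thrash during that final phase, enabling GC logging with the standard -verbose:gc switch should show full collections running back to back and recovering almost nothing (I haven't reproduced the log output here):

> java -verbose:gc -Xms1024m -Xmx1024m GCOverheadTrigger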
And, for completeness, here's the JVM that I'm using:
> java -version
java version "1.6.0_45"
Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
And now my question for you: why in the world would you want to do this?