
Why does this keep happening? Solr OutOfMemoryError: GC overhead limit exceeded

I am load testing my Solr application. The index has more than 200 million documents. I use the default Jetty server and set the maximum JVM heap size to 4 GB. To test the app, I generate 5,000 text queries and issue them to Solr one by one. However, after about 110 queries, the Jetty container throws the exception below.

Why does this happen? How can I solve it?

SEVERE: java.lang.OutOfMemoryError: GC overhead limit exceeded
    at org.apache.lucene.util.AttributeImpl.clone(AttributeImpl.java:196)
    at org.apache.lucene.util.AttributeSource$State.clone(AttributeSource.java:116)
    at org.apache.lucene.util.AttributeSource$State.clone(AttributeSource.java:119)
    at org.apache.lucene.util.AttributeSource.captureState(AttributeSource.java:349)
    at org.apache.solr.highlight.TokenOrderingFilter.incrementToken(DefaultSolrHighlighter.java:595)
    at org.apache.lucene.search.highlight.OffsetLimitTokenFilter.incrementToken(OffsetLimitTokenFilter.java:43)
    at org.apache.lucene.analysis.CachingTokenFilter.fillCache(CachingTokenFilter.java:78)
    at org.apache.lucene.analysis.CachingTokenFilter.incrementToken(CachingTokenFilter.java:50)
    at org.apache.lucene.search.highlight.Highlighter.getBestTextFragments(Highlighter.java:225)
    at org.apache.solr.highlight.DefaultSolrHighlighter.doHighlightingByHighlighter(DefaultSolrHighlighter.java:468)
    at org.apache.solr.highlight.DefaultSolrHighlighter.doHighlighting(DefaultSolrHighlighter.java:379)
    at org.apache.solr.handler.component.HighlightComponent.process(HighlightComponent.java:116)
    at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:194)
    at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129)
    at org.apache.solr.core.SolrCore.execute(SolrCore.java:1368)
    at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:356)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:252)
    at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212)
    at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:399)
    at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
    at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
    at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
    at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
    at org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
    at org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
    at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
    at org.mortbay.jetty.Server.handle(Server.java:326)
    at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
    at org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
    at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
    at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
    at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
Xiao asked Aug 26 '12 01:08


People also ask

What is GC overhead limit?

The GC overhead limit exceeded error is thrown by the JVM when it encounters a problem utilizing resources. More specifically, it occurs when the JVM has spent too much time performing garbage collection while reclaiming only a very small amount of heap space.
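
For reference, a rough sketch of the HotSpot flags behind this check (the thresholds shown are the documented HotSpot defaults; the start.jar target is only a placeholder for whatever command launches your JVM):

    # The error fires when more than GCTimeLimit percent of total time goes to
    # garbage collection while less than GCHeapFreeLimit percent of the heap is freed.
    java -XX:GCTimeLimit=98 -XX:GCHeapFreeLimit=2 -jar start.jar

    # The check can be switched off, but that only hides the symptom: the JVM
    # still spends most of its time collecting and will eventually throw a
    # plain OutOfMemoryError instead.
    java -XX:-UseGCOverheadLimit -jar start.jar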

How do I fix GC overhead limit exceeded in Eclipse?

From the root of the Eclipse folder, open eclipse.ini and change the default maximum heap size of -Xmx256m to -Xmx1024m on the last line. NOTE: If there is a lot of memory available on the machine, you can also try -Xmx2048m as the maximum heap size.
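
As an illustration, the tail of eclipse.ini might look like this after the change (the surrounding lines vary between Eclipse versions, so treat this only as a sketch):

    -vmargs
    -Xms256m
    -Xmx1024m

Everything after -vmargs is passed straight to the JVM, so the -Xmx line is what sets the maximum heap.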

How can we avoid the GC overhead limit exceeded error in Talend?

"GC overhead limit exceeded" message is something which cannot be truly removed by increasing the available memory. Rather GC should be put into a different mode (perhaps event different than suggested by me) to handle the situation properly.


1 Answer

Obviously, 4 GB of RAM is very low for a load test against a 200-million-document index. We ran performance tests of Solr 4.2 on 300 million documents with an average document size of 1 KB. The goal was to find the minimal machine configuration that gives a stable response time under 3 seconds for non-faceted queries. For 100 concurrent queries, our results showed the minimal configuration to be 8 CPU cores and 15 GB of RAM. Results will of course vary with many factors, but you can use this as a rule of thumb for sizing your machine.
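
If you do add RAM, here is a minimal sketch of raising the heap for the Jetty that ships with the Solr example (assuming the stock layout where Solr is started via start.jar; the sizes are illustrative):

    # Start Solr's example Jetty with a larger heap. Leave a good share of RAM
    # unassigned so the OS page cache can hold the index files, which Lucene
    # relies on heavily for query performance.
    cd example
    java -Xms4g -Xmx8g -jar start.jar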

Vladimir Kroz answered Oct 17 '22 05:10