Details
- Type: Bug
- Resolution: Unresolved
- Priority: Major
- Fix Version/s: None
- Affects Version/s: 6.0.1
- Environment: 32GB RAM, Solaris 11.1, JDK 1.7.0_07-b10
- Difficulty: Unknown
Description
Hello,
When uploading a lot of pages into XWiki Enterprise 6.0.1, which is configured to run inside Tomcat with PostgreSQL as the backend, I am still very easily able to nearly kill the whole wiki by uploading a few thousand small pages. Java is basically killed by various OutOfMemoryError exceptions thrown from random places.
If you have some time to reproduce this, create a content.xml with:
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<page xmlns="http://www.xwiki.org">
  <title>Semantic XWiki Benchmark page</title>
  <syntax>xwiki/2.0</syntax>
  <content>This is a page content.</content>
</page>
and then upload pages with:
$ curl -X PUT -H "Content-Type: application/xml" -d @content.xml -o /dev/null http://Admin:admin@localhost:8080/xwiki/rest/wikis/xwiki/spaces/Benchmark/pages/BenchPage_[1-500000]
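(To sanity-check that the pages are really being created, one of them can be fetched back over the same REST resource; this GET is only my assumption of the symmetric read call and is not needed for the reproduction itself:)
$ curl -s http://Admin:admin@localhost:8080/xwiki/rest/wikis/xwiki/spaces/Benchmark/pages/BenchPage_1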
It takes some time (an hour or more) to run into the OOM, so be patient. If this does not work for you, use a higher number of pages or run the uploads in parallel with:
#!/usr/bin/env bash
echo "starting curl 1"
curl -s -X PUT -H "Content-Type: application/xml" -d @content.xml -o /dev/null http://Admin:admin@localhost:8080/xwiki/rest/wikis/xwiki/spaces/Benchmark/pages/BenchPage_[1-125000] &
echo "starting curl 2"
curl -s -X PUT -H "Content-Type: application/xml" -d @content.xml -o /dev/null http://Admin:admin@localhost:8080/xwiki/rest/wikis/xwiki/spaces/Benchmark/pages/BenchPage_[125001-250000] &
echo "starting curl 3"
curl -s -X PUT -H "Content-Type: application/xml" -d @content.xml -o /dev/null http://Admin:admin@localhost:8080/xwiki/rest/wikis/xwiki/spaces/Benchmark/pages/BenchPage_[250001-375000] &
echo "starting curl 4"
curl -s -X PUT -H "Content-Type: application/xml" -d @content.xml -o /dev/null http://Admin:admin@localhost:8080/xwiki/rest/wikis/xwiki/spaces/Benchmark/pages/BenchPage_[375001-500000] &
echo "waiting for jobs to complete..."
wait
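If you want to watch the heap while the load runs, something like the following should do; this is only a sketch assuming the standard JDK jps/jstat tools, and the pattern used to find the Tomcat process is my assumption:
#!/usr/bin/env bash
# Print GC/heap utilisation of the Tomcat JVM every 10 seconds (10000 ms).
TOMCAT_PID=$(jps -l | awk '/org.apache.catalina.startup.Bootstrap/ {print $1}')
jstat -gcutil "$TOMCAT_PID" 10000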
Of course, you will probably tell me that I should increase Java's heap limit, but believe me, I have reproduced this with:
CATALINA_OPTS="-d64 -server -Xms2048m -Xmx2048m -XX:MaxPermSize=196m -Dfile.encoding=utf-8 -Djava.awt.headless=true -XX:+UseParallelGC -XX:MaxGCPauseMillis=100"
export CATALINA_OPTS
in setenv.sh in Catalina's bin directory, and also with an 8GB heap configured with:
CATALINA_OPTS="-d64 -server -Xms8192m -Xmx8192m -XX:MaxPermSize=196m -Dfile.encoding=utf-8 -Djava.awt.headless=true -XX:+UseParallelGC -XX:MaxGCPauseMillis=100"
export CATALINA_OPTS
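For the next run it may also be worth capturing a heap dump at the moment of failure; a sketch of the extra setenv.sh lines, using the standard HotSpot flags (the dump path is just an example):
CATALINA_OPTS="$CATALINA_OPTS -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/tmp/xwiki-oom.hprof"
export CATALINA_OPTS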
Everything is running on a 32GB RAM box with Solaris 11.1 and Java 1.7.0_07-b10.
The trace from one run of the issue looks like this:
Exception in thread "RMI TCP Connection(idle)" java.lang.OutOfMemoryError: Java heap space
    at java.io.BufferedInputStream.<init>(BufferedInputStream.java:195)
    at java.io.BufferedInputStream.<init>(BufferedInputStream.java:175)
    at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:697)
    at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:667)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
    at java.lang.Thread.run(Thread.java:722)
Exception in thread "ajp-bio-8009-AsyncTimeout" java.lang.OutOfMemoryError: Java heap space
Exception in thread "ContainerBackgroundProcessor[StandardEngine[Catalina]]" java.lang.OutOfMemoryError: Java heap space
    at java.util.concurrent.ConcurrentHashMap$KeySet.iterator(ConcurrentHashMap.java:1427)
    at java.util.AbstractCollection.toArray(AbstractCollection.java:179)
    at org.apache.catalina.session.StandardSession.keys(StandardSession.java:1765)
    at org.apache.catalina.session.StandardSession.expire(StandardSession.java:863)
    at org.apache.catalina.session.StandardSession.isValid(StandardSession.java:656)
    at org.apache.catalina.session.ManagerBase.processExpires(ManagerBase.java:532)
    at org.apache.catalina.session.ManagerBase.backgroundProcess(ManagerBase.java:517)
    at org.apache.catalina.core.ContainerBase.backgroundProcess(ContainerBase.java:1352)
    at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1530)
    at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1540)
    at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1540)
    at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.run(ContainerBase.java:1519)
    at java.lang.Thread.run(Thread.java:722)
Exception in thread "Lucene Index Updater" java.lang.OutOfMemoryError: Java heap space
Jun 26, 2014 7:10:26 PM ServerCommunicatorAdmin reqIncoming
WARNING: The server has decided to close this client connection.
2014-06-26 19:10:31,992 [XWiki Solr index thread] ERROR o.a.s.c.SolrCore - java.lang.NullPointerException
    at org.apache.catalina.loader.WebappClassLoader.findResourceInternal(WebappClassLoader.java:3119)
    at org.apache.catalina.loader.WebappClassLoader.findClassInternal(WebappClassLoader.java:2892)
    at org.apache.catalina.loader.WebappClassLoader.findClass(WebappClassLoader.java:1210)
    at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1690)
    at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1571)
    at org.apache.lucene.index.DocumentsWriter.doFlush(DocumentsWriter.java:518)
    at org.apache.lucene.index.DocumentsWriter.flushAllThreads(DocumentsWriter.java:618)
    at org.apache.lucene.index.IndexWriter.prepareCommitInternal(IndexWriter.java:2891)
    at org.apache.lucene.index.IndexWriter.commitInternal(IndexWriter.java:3049)
    at org.apache.lucene.index.IndexWriter.commit(IndexWriter.java:3016)
    at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:578)
    at org.apache.solr.update.processor.RunUpdateProcessor.processCommit(RunUpdateProcessorFactory.java:95)
    at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:64)
    at org.apache.solr.update.processor.DistributedUpdateProcessor.doLocalCommit(DistributedUpdateProcessor.java:1458)
    at org.apache.solr.update.processor.DistributedUpdateProcessor.processCommit(DistributedUpdateProcessor.java:1435)
    at org.apache.solr.handler.RequestHandlerUtils.handleCommit(RequestHandlerUtils.java:69)
    at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:68)
    at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:135)
    at org.apache.solr.core.SolrCore.execute(SolrCore.java:1916)
    at org.apache.solr.client.solrj.embedded.EmbeddedSolrServer.request(EmbeddedSolrServer.java:150)
    at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:118)
    at org.apache.solr.client.solrj.SolrServer.commit(SolrServer.java:168)
    at org.apache.solr.client.solrj.SolrServer.commit(SolrServer.java:146)
    at org.xwiki.search.solr.internal.AbstractSolrInstance.commit(AbstractSolrInstance.java:101)
    at org.xwiki.search.solr.internal.DefaultSolrIndexer.commit(DefaultSolrIndexer.java:441)
    at org.xwiki.search.solr.internal.DefaultSolrIndexer.processBatch(DefaultSolrIndexer.java:429)
    at org.xwiki.search.solr.internal.DefaultSolrIndexer.runInternal(DefaultSolrIndexer.java:374)
    at com.xpn.xwiki.util.AbstractXWikiRunnable.run(AbstractXWikiRunnable.java:131)
    at java.lang.Thread.run(Thread.java:722)
Exception in thread "XWiki Solr index thread" java.lang.NullPointerException
    at org.apache.catalina.loader.WebappClassLoader.findResourceInternal(WebappClassLoader.java:3119)
    at org.apache.catalina.loader.WebappClassLoader.findClassInternal(WebappClassLoader.java:2892)
    at org.apache.catalina.loader.WebappClassLoader.findClass(WebappClassLoader.java:1210)
    at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1690)
    at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1571)
    at ch.qos.logback.classic.spi.LoggingEvent.<init>(LoggingEvent.java:121)
    at ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:440)
    at ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:396)
    at ch.qos.logback.classic.Logger.error(Logger.java:559)
    at org.xwiki.search.solr.internal.DefaultSolrIndexer.commit(DefaultSolrIndexer.java:443)
    at org.xwiki.search.solr.internal.DefaultSolrIndexer.processBatch(DefaultSolrIndexer.java:429)
    at org.xwiki.search.solr.internal.DefaultSolrIndexer.runInternal(DefaultSolrIndexer.java:374)
    at com.xpn.xwiki.util.AbstractXWikiRunnable.run(AbstractXWikiRunnable.java:131)
    at java.lang.Thread.run(Thread.java:722)
Exception in thread "Lucene Merge Thread #10103" org.apache.lucene.index.MergePolicy$MergeException: java.lang.OutOfMemoryError: Java heap space
    at org.apache.lucene.index.ConcurrentMergeScheduler.handleMergeException(ConcurrentMergeScheduler.java:545)
    at org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:518)
Caused by: java.lang.OutOfMemoryError: Java heap space
But as I said, it's quite random. On the xwiki-devs@ mailing list I've already reported several other occurrences (the notifier was also considered a possible culprit here).
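If it helps with the diagnosis, a class histogram of the live heap can be taken while the memory is climbing; again only a sketch using the standard JDK jmap tool, and the pid lookup is the same assumption as above:
$ jmap -histo:live $(jps -l | awk '/org.apache.catalina.startup.Bootstrap/ {print $1}') | head -30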
Thanks!
Karel
Attachments
Issue Links
- relates to: XINFRA-157 SOLR not available on myxwiki.org (Closed)