I'm having a problem exporting a rather large namespace (~200 pages) with all children to HTML via the default export feature.
The goal is to get a fully browsable static offline site (exported documentation of our software package).
The XAR export is quick and successful, but not very useful for the end users...
The HTML export runs for around 2 minutes and then fails, logging java.lang.OutOfMemoryError: GC overhead limit exceeded to catalina.out.
When the HTML ZIP download fails, the file is only around 1 MB, so I guess the rest of the export still resides in RAM.
The instance is running with the maximum heap set via setenv.sh: CATALINA_OPTS="$CATALINA_OPTS -Xmx1024m".
Prior to the OutOfMemoryError, several other errors are logged, such as errors in Velocity templates.
There does not seem to be a specific page causing the problem. When divided into chunks of ~50 pages, all pages export fine.
I eventually made it work by increasing the heap size from 1 GB to 2 GB.
Here are the stats of my observations:
- Export all (~200 pages): does not finish with 1024 MB or 1536 MB of RAM
- Export all but the largest space (~100 pages, 24 MB unzipped): finishes successfully with 1024 MB
- Export the largest space (~100 pages, 21 MB unzipped): does not finish with 1024 MB
- Export all (~200 pages, 45 MB unzipped): finishes successfully with 2048 MB after a Tomcat restart
Within the largest namespace (~100 pages), all pages are generated automatically from a template. They are not very large and contain only the source code of database procedures plus some description of the code. The longest is somewhere near 1000 lines of SQL, wrapped in a code macro.
At the moment, I have no problem throwing a bit more RAM at the wiki server, so I will stick to this solution for the time being.
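For reference, this is a sketch of the heap change in setenv.sh; it assumes a standard Tomcat layout ($CATALINA_BASE/bin/setenv.sh), and the path and any additional options may differ per installation:

```shell
# setenv.sh -- raise the maximum JVM heap from 1 GB to 2 GB for the export.
# Appending to CATALINA_OPTS preserves any options set earlier in the file.
CATALINA_OPTS="$CATALINA_OPTS -Xmx2048m"
export CATALINA_OPTS
```

Tomcat has to be restarted for the new heap setting to take effect.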
A workaround would be using wget to create the export.
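Something along these lines should produce a browsable offline mirror of the rendered pages; the URL is a placeholder for your wiki, and authentication options would need to be added if the wiki is not public:

```shell
# Hedged sketch: crawl the rendered wiki into a static offline copy.
# --mirror           recursive download with timestamping
# --convert-links    rewrite links so the copy is browsable locally
# --page-requisites  also fetch CSS, images, and scripts
# --adjust-extension save pages with .html extensions
# --no-parent        stay below the starting space
wget --mirror --convert-links --page-requisites --adjust-extension \
     --no-parent "https://wiki.example.com/xwiki/bin/view/MySpace/"
```

This sidesteps the server-side export entirely, since the memory cost stays on the client.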