XWiki Platform / XWIKI-13939

Serializing Solr indexer job progress on big wikis takes way too much memory



    Description

      This seems to be the root cause of what takes myxwiki.org down most of the time.

      Pretty much all the memory goes into a map that XStream uses to remember every object it serializes. In practice the map contains a huge number of duplicate com.thoughtworks.xstream.io.path.Path instances, most of which are useless. In my case it's the Solr indexing job; I'm not sure whether this is specific to that job or related to job progress serialization in general.

      This looks like a bug: from what I understand, the map is there to remember unique objects, so there is no reason for it to contain so many duplicates. Or maybe I'm not reading YourKit properly and they are not really duplicates; it may just be that the progress contains an insane number of objects and XStream is simply not designed to handle that properly.

      The worst part is that this whole map is completely useless in the case of job progress, since all serialized objects are unique: there is no need for the reference-tracking optimization XStream applies by default to produce smaller XML.
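
      For reference, a minimal sketch of how XStream's reference tracking could be switched off for this kind of serialization. setMode() and XStream.NO_REFERENCES are part of XStream's public API; how (or whether) this would be wired into XWiki's job status serialization is an assumption here, not a description of the current code.

          import com.thoughtworks.xstream.XStream;

          // Minimal sketch, assuming a dedicated XStream instance for job statuses.
          // NO_REFERENCES disables the reference-tracking marshaller, so XStream no
          // longer records a Path for every object it has already written. The
          // trade-off is that shared references get duplicated in the output and
          // circular references fail, which is acceptable only because job progress
          // objects are expected to be unique.
          public class NoReferenceStatusSerializer
          {
              public static String serialize(Object status)
              {
                  XStream xstream = new XStream();
                  xstream.setMode(XStream.NO_REFERENCES);

                  return xstream.toXML(status);
              }
          }

      With reference tracking disabled, the marshaller no longer needs to keep every serialized object (and its Path) in memory for the whole duration of the serialization, which is exactly the map that fills the heap on big wikis.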


    People

      Assignee: Thomas Mortagne (tmortagne)
      Reporter: Thomas Mortagne (tmortagne)
