XWiki Platform / XWIKI-20264

Limit the amount of content that can be included in a single PDF export


Details

    • Type: Improvement
    • Resolution: Fixed
    • Priority: Major
    • Fix Version/s: 14.10
    • Affects Version/s: 14.9-rc-1
    • Component/s: Export - PDF
    • Labels: None
    • Tested by: Unit
    • Difficulty: Unknown

    Description

      When exporting multiple wiki pages to PDF we have to:

      1. render each wiki page to HTML
      2. aggregate all the results into a single HTML web page

      The first step is done in a low-priority background (daemon) thread on the server, so CPU-wise it shouldn't slow down the server. However, collecting the result of rendering each wiki page might require a lot of memory, because we also need to keep the XDOM trees in order to compute the table of contents.
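
      Below is a minimal sketch of how this first step could be scheduled, assuming a hypothetical renderToHtml() helper; the executor and thread settings are illustrative only, not XWiki's actual implementation:

      {code:java}
      import java.util.List;
      import java.util.concurrent.ExecutorService;
      import java.util.concurrent.Executors;

      public class BackgroundRenderer
      {
          // Single worker that is a low-priority daemon thread, so rendering
          // doesn't starve request threads and doesn't block JVM shutdown.
          private final ExecutorService executor = Executors.newSingleThreadExecutor(runnable -> {
              Thread thread = new Thread(runnable, "pdf-export-renderer");
              thread.setDaemon(true);
              thread.setPriority(Thread.MIN_PRIORITY);
              return thread;
          });

          public void renderAll(List<String> pageReferences, StringBuilder aggregatedHtml)
          {
              for (String reference : pageReferences) {
                  // All appends happen on the single worker thread, so the
                  // StringBuilder needs no extra synchronization in this sketch.
                  executor.submit(() -> aggregatedHtml.append(renderToHtml(reference)));
              }
          }

          // Hypothetical stand-in for the actual wiki page -> HTML rendering.
          private String renderToHtml(String reference)
          {
              return "<section>" + reference + "</section>";
          }
      }
      {code}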

      The second step is done in a web browser (either a remote headless Chrome or the user's own web browser). If the generated HTML page is very large, the browser may have problems loading it and generating the PDF, which can drain a lot of resources (CPU and memory). The CPU is used intensively, for instance, when paginating the content with paged.js.

      This means that users can kill their own web browser, or the headless browser on the server (and possibly slow down XWiki itself), by performing large exports, either intentionally or by mistake. Ideally, we should enforce some limits.
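
      One possible way to enforce such a limit is to measure the aggregated content as it is collected and fail fast once a configured maximum is exceeded. The sketch below is only an illustration; the ContentSizeGuard class and its character-based limit are hypothetical, not the actual fix:

      {code:java}
      public class ContentSizeGuard
      {
          private final long maxContentSize;

          private long currentSize;

          public ContentSizeGuard(long maxContentSize)
          {
              this.maxContentSize = maxContentSize;
          }

          /**
           * Accumulates the size of each rendered wiki page and fails fast when
           * the configured limit is exceeded, before the browser is ever asked
           * to load the aggregated HTML.
           */
          public void add(String renderedHtml)
          {
              this.currentSize += renderedHtml.length();
              if (this.currentSize > this.maxContentSize) {
                  throw new IllegalStateException(String.format(
                      "PDF export aborted: content size %d exceeds the configured limit of %d characters.",
                      this.currentSize, this.maxContentSize));
              }
          }
      }
      {code}

      Rejecting the export on the server side, before the (possibly remote) browser is asked to load and paginate an oversized document, protects both the user's own browser and the headless Chrome instance.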


      People

        Assignee: Marius Dumitru Florea (mflorea)
        Reporter: Marius Dumitru Florea (mflorea)
        Votes: 0
        Watchers: 1
