XWiki Platform
XWIKI-17535

Filter stream xar/1.2 export failing because of 4GB file limit

    Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 7.4
    • Fix Version/s: None
    • Component/s: Filter
    • Labels:
      None
    • Difficulty:
      Unknown
      Description

      An export on a 7.4 wiki instance failed with the following error:

         <throwable class="org.xwiki.filter.FilterException">
            <detailMessage>Failed to close zip archive entry</detailMessage>
            <cause class="org.apache.commons.compress.archivers.zip.Zip64RequiredException">
              <detailMessage>Main/Communities/Affiliates/Affiliate_Status/WebHome.xml&apos;s size exceeds the limit of 4GByte.</detailMessage>
          <stackTrace>
            <trace>org.apache.commons.compress.archivers.zip.ZipArchiveOutputStream.checkIfNeedsZip64(ZipArchiveOutputStream.java:632)</trace>
            <trace>org.apache.commons.compress.archivers.zip.ZipArchiveOutputStream.handleSizesAndCrc(ZipArchiveOutputStream.java:619)</trace>
            <trace>org.apache.commons.compress.archivers.zip.ZipArchiveOutputStream.closeArchiveEntry(ZipArchiveOutputStream.java:494)</trace>
            <trace>org.xwiki.filter.xar.internal.output.XARWikiWriter.closeEntry(XARWikiWriter.java:148)</trace>
            <trace>org.xwiki.filter.xar.internal.output.XAROutputFilterStream.endWikiDocumentLocale(XAROutputFilterStream.java:281)</trace>

      The page on which this error happened has 294 versions and more than 100 attachments totaling around 100-150 MB.

      Looking at this error, it appears to be linked to the behavior documented for ZipArchiveOutputStream:
      https://commons.apache.org/proper/commons-compress/apidocs/org/apache/commons/compress/archivers/zip/ZipArchiveOutputStream.html

      "As of Apache Commons Compress 1.3 it transparently supports Zip64 extensions and thus individual entries and archives larger than 4 GB or with more than 65536 entries in most cases but explicit control is provided via setUseZip64(org.apache.commons.compress.archivers.zip.Zip64Mode). If the stream can not use SeekableByteChannel and you try to write a ZipArchiveEntry of unknown size then Zip64 extensions will be disabled by default."

      It seems we end up in the case where Zip64 extensions are disabled, so the export is blocked at the 4 GB limit.
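      To illustrate the limit, here is a minimal sketch using Python's standard zipfile module as a stand-in for commons-compress (an assumption for illustration only, not XWiki's actual code): with Zip64 disabled, any archive that needs the extensions fails on close. The sketch triggers the classic format's 65535-entry limit rather than the 4 GB entry limit, since both fail the same way and the entry count is cheap to reach in an example.

```python
import io
import zipfile

# Stand-in for a ZIP writer with Zip64 extensions disabled
# (assumption: Python's zipfile is used here only to illustrate the limit).
buf = io.BytesIO()
try:
    with zipfile.ZipFile(buf, "w", allowZip64=False) as zf:
        # 65536 entries: one more than the classic ZIP format allows
        for i in range(65536):
            zf.writestr(f"entry-{i}.txt", b"")
    failed = False
except zipfile.LargeZipFile:
    # Raised on close, analogous to commons-compress's
    # Zip64RequiredException when the archive would need Zip64
    failed = True

print(failed)  # True: the classic format cannot hold this archive
```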

      Hitting the 4 GB limit is surprising, since exporting the same page with another method produced an XML file of only 140 MB.

      In any case, error handling should be improved so that a failure on one specific page does not stop the whole export.
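      One possible shape for that error handling, again sketched in Python with a hypothetical export_documents helper (the names and the per-entry skip strategy are assumptions, not XWiki's actual API): catch the size failure per entry, record the page, and continue with the rest of the export.

```python
import io
import zipfile

def export_documents(docs, out):
    """Hypothetical sketch: write each document as its own archive entry,
    skipping (and reporting) entries that exceed the format limits instead
    of aborting the whole export."""
    skipped = []
    with zipfile.ZipFile(out, "w", allowZip64=False) as zf:
        for name, data in docs:
            try:
                zf.writestr(name, data)
            except zipfile.LargeZipFile:
                skipped.append(name)  # keep exporting the other pages
    return skipped

# Usage: two small documents, both fit, so nothing is skipped
buf = io.BytesIO()
skipped = export_documents([("a.xml", b"<page/>"), ("b.xml", b"<page/>")], buf)
print(skipped)  # []
```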

      Exporting the same wiki as a XAR from the standard export also failed.

            People

            • Assignee:
              Unassigned
            • Reporter:
              Ludovic Dubost (ludovic)
            • Votes:
              0
            • Watchers:
              2
