LLM AI Integration / LLMAI-96

Separate the internal inference engine into a separate extension


Details

    • Type: Improvement
    • Resolution: Fixed
    • Priority: Major
    • Fix Version: 0.5.1
    • Affects Version: 0.5
    • Component: None
    • Difficulty: Unknown
    • Documentation: N/A

    Description

      The internal inference engine should be moved into a separate extension so that the main extension can still be installed on XWiki versions affected by XCOMMONS-3088. The split also spares users who do not need the internal engine from its rather heavy new dependencies (PyTorch alone is about 120 MB).
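
      In Maven terms, such a split usually means giving the inference engine its own module so that the heavy runtime dependency is pulled in only when that extension is installed. The sketch below is illustrative only: the artifact coordinates are hypothetical placeholders, and the assumption that the engine is backed by DJL's PyTorch engine is inferred from the 120 MB PyTorch dependency mentioned above, not confirmed by this issue.

      ```xml
      <!-- Hypothetical pom.xml for the separated inference-engine extension.
           Group/artifact ids are placeholders, not the project's real coordinates. -->
      <project xmlns="http://maven.apache.org/POM/4.0.0">
        <modelVersion>4.0.0</modelVersion>
        <groupId>org.xwiki.contrib.llm</groupId>
        <artifactId>llm-inference-engine</artifactId>
        <version>0.5.1</version>
        <packaging>jar</packaging>
        <dependencies>
          <!-- The heavy ML runtime lives only in this optional extension,
               keeping the main LLM extension lightweight. -->
          <dependency>
            <groupId>ai.djl.pytorch</groupId>
            <artifactId>pytorch-engine</artifactId>
            <version>0.26.0</version>
          </dependency>
        </dependencies>
      </project>
      ```

      With this layout, only users who explicitly install the inference-engine extension download the PyTorch runtime; everyone else installs the main extension without it.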


          People

            Assignee: Michael Hamann
            Reporter: Michael Hamann
            Votes: 0
            Watchers: 1
