LLM AI Integration / LLMAI-93

The LLM chat doesn't work when XWiki is the root application


Details

    • Type: Bug
    • Resolution: Fixed
    • Priority: Major
    • Fix Version/s: 0.5.3
    • Affects Version/s: 0.4
    • Component/s: None
    • Difficulty: Unknown

    Description

      Steps to reproduce:

      1. Install the LLM application on an XWiki instance that is configured as the root application, e.g., the standard XWiki Docker image.
      2. Configure some models.
      3. Open the chat UI by clicking on the "LLM Chat" button at the bottom right of the page.

      Expected result:

      The configured models are listed.

      Actual result:

      No models are listed. The network tab of the browser's developer tools shows that a request to /xwiki/rest/wikis/xwiki/aiLLM/v1/models?media=json failed with status 404: the /xwiki/ prefix is invalid when XWiki is deployed as the root application. The correct URL would have been /rest/wikis/xwiki/aiLLM/v1/models.
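      The fix amounts to deriving the REST endpoint from the deployment's context path instead of hardcoding the /xwiki/ prefix. The sketch below is hypothetical (the function name and parameters are not from the extension's actual code; in a real XWiki page the context path would come from the platform, e.g. a variable like XWiki.contextPath):

      ```javascript
      // Hypothetical sketch: build the models URL from the servlet context path.
      // contextPath is "" when XWiki is the root application, "/xwiki" otherwise.
      function modelsUrl(contextPath, wiki) {
        return `${contextPath}/rest/wikis/${wiki}/aiLLM/v1/models?media=json`;
      }

      // Root deployment (as in this bug): no prefix.
      console.log(modelsUrl("", "xwiki"));
      // → /rest/wikis/xwiki/aiLLM/v1/models?media=json

      // Non-root deployment under /xwiki: the prefix is preserved.
      console.log(modelsUrl("/xwiki", "xwiki"));
      // → /xwiki/rest/wikis/xwiki/aiLLM/v1/models?media=json
      ```

      With this approach the same code produces a valid URL in both deployment modes, which is why hardcoding the prefix only fails on root installs such as the standard Docker image.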


          People

            Assignee: Paul Pantiru (ppantiru)
            Reporter: Michael Hamann (MichaelHamann)
            Votes: 0
            Watchers: 1
