  LLM AI Integration / LLMAI-123

The application cannot be used in subwikis


Details

    • Type: Bug
    • Resolution: Fixed
    • Priority: Major
    • Fix Version/s: 0.7.1
    • Affects Version/s: 0.7

    Description

      Steps to reproduce:

      1. Install the application in a subwiki (you first need to install the indexing API on the farm and then the UI on the subwiki).
      2. Configure a collection and a model that uses it.
      3. Try using the model.

      Expected result:

      The model can be used.

      Actual result:

      The model isn't listed. Even when guest access is allowed, only the chat on the LLM application's main page lists the model, and no context is used regardless of how the permissions on the context are configured.


          People

            Assignee: Michael Hamann
            Reporter: Michael Hamann
