LLM AI Integration
LLMAI-42

Don't query the models API when the model list is provided


Details

    • Type: Bug
    • Resolution: Fixed
    • Priority: Major
    • Fix Version/s: 0.3
    • Affects Version/s: 0.2.1

    Description

      When the list of models is provided in the configuration, the LLM app shouldn't query the API for available models. This fixes compatibility with providers such as https://app.fireworks.ai/ that don't expose a models endpoint. It is also a first step towards Azure support in LLMAI-23.
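      The intended behavior can be sketched as follows. This is a minimal illustration, not the app's actual code; the names `get_available_models`, `fetch_models_from_api`, and the configuration keys are hypothetical:

      ```python
      def fetch_models_from_api(api_url: str) -> list[str]:
          # Placeholder for the HTTP call to the provider's models endpoint.
          # Providers such as Fireworks AI may not expose this endpoint at all,
          # so this call can fail for them.
          raise NotImplementedError

      def get_available_models(config: dict) -> list[str]:
          # If an explicit model list was configured, use it directly and
          # skip the models API call entirely (the fix in this issue).
          configured = config.get("models")
          if configured:
              return list(configured)
          # Otherwise fall back to querying the provider's models endpoint.
          return fetch_models_from_api(config["api_url"])
      ```

      With a configured list, `get_available_models` never touches the network, which is what makes providers without a models endpoint usable.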


            People

              Assignee: Michael Hamann
              Reporter: Michael Hamann
              Votes: 0
              Watchers: 1
