Details
- Type: Bug
- Resolution: Fixed
- Priority: Major
- Fix Version: 0.2.1
- None
- Unknown
- N/A
- N/A
Description
When the list of models is provided in the configuration, the LLM app shouldn't query the API for available models. This fixes compatibility with providers such as https://app.fireworks.ai/ that don't provide the models endpoint. It is also a first step toward Azure support (LLMAI-23).
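The intended behavior can be sketched as follows — a minimal illustration, not the app's actual code; the names `get_available_models`, `config`, and `fetch_models_from_api` are hypothetical:

```python
# Sketch of the fix: prefer a configured model list over querying the
# provider's models endpoint, since some providers (e.g.
# https://app.fireworks.ai/) do not implement that endpoint.
# All identifiers here are illustrative, not from the project.

def get_available_models(config, fetch_models_from_api):
    """Return the models to offer, skipping the API call entirely
    when the configuration already lists them."""
    configured = config.get("models")
    if configured:
        # Models are set in the configuration: use them as-is,
        # without any network request.
        return configured
    # Otherwise fall back to the provider's models endpoint; guard the
    # call because the endpoint may be missing for this provider.
    try:
        return fetch_models_from_api()
    except Exception:
        return []
```

With a configured list, `fetch_models_from_api` is never invoked, which is what restores compatibility with providers lacking the endpoint.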
Attachments
Issue Links
- blocks: LLMAI-23 "Support OpenAI hosted by Azure" (Open)