LLM AI Integration / LLMAI-106

Add support for embedding model prompt prefixes


Details

    • Type: New Feature
    • Resolution: Fixed
    • Priority: Major
    • Fix Version/s: 0.6
    • Affects Version/s: 0.5.3
    • Component/s: None
    • Unknown

    Description

      Some embedding models, such as
      https://huggingface.co/mixedbread-ai/mxbai-embed-large-v1
      and
      https://huggingface.co/nomic-ai/nomic-embed-text-v1.5,

      require special prompt prefixes to perform well in use cases like RAG. We therefore need two more fields in the model configuration for injecting these prefixes: one prefix applied when indexing documents and one applied to queries.
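      A minimal sketch of how these two fields could be applied is shown below. The class and field names (EmbeddingModelConfiguration, indexPrefix, queryPrefix) are hypothetical and not taken from the extension's code; the example prefix values in the comments are the documented task prefixes of nomic-embed-text-v1.5.

          /**
           * Sketch only: illustrates the two proposed configuration fields and where
           * each prefix would be injected. The real extension may differ.
           */
          public class EmbeddingModelConfiguration
          {
              // Prefix prepended to document chunks before they are embedded for the
              // index, e.g. "search_document: " for nomic-embed-text-v1.5.
              private String indexPrefix = "";

              // Prefix prepended to the user's question before it is embedded at
              // retrieval time, e.g. "search_query: " for nomic-embed-text-v1.5.
              private String queryPrefix = "";

              public String prefixForIndexing(String text)
              {
                  return this.indexPrefix + text;
              }

              public String prefixForQuery(String text)
              {
                  return this.queryPrefix + text;
              }
          }

      Keeping the prefixes as plain configuration strings (empty by default) means models that do not need them are unaffected, while prefix-sensitive models get the correct prompt on both the indexing and the query path.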


          People

            Assignee: Paul Pantiru (ppantiru)
            Reporter: Paul Pantiru (ppantiru)
            Votes: 0
            Watchers: 0
