LLM models

You can configure which LLM Sensible uses to find answers in context for LLM-based extraction methods. The following tables list the configuration options for each method and the model each option selects.

List method

| Configuration | LLM engine parameter: `provider: openai` | LLM engine parameter: `provider: anthropic` |
| --- | --- | --- |
| LLM engine parameter: `mode: fast` | GPT-4o mini | Claude 3.5 Haiku |
| LLM engine parameter: `mode: thorough` | GPT-4o | Claude 3.7 Sonnet |
| LLM engine parameter: `mode: long` | GPT-4o mini | Claude 3.5 Haiku |
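For example, to select the thorough models in the table above, you could set the LLM engine parameter on a List method. This is a minimal sketch: the `provider` and `mode` values come from the table, but the surrounding key names (`llmEngine`, `method`, `id`) are illustrative, so check the SenseML method reference for the exact syntax:

```json
{
  "method": {
    "id": "list",
    "llmEngine": {
      "provider": "anthropic",
      "mode": "thorough"
    }
  }
}
```

With `provider: anthropic` and `mode: thorough`, Sensible uses Claude 3.7 Sonnet; the same fragment with `provider: openai` selects GPT-4o.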

Query Group method

| Configuration | LLM engine parameter: `provider: openai` | LLM engine parameter: `provider: anthropic` |
| --- | --- | --- |
| (default) | GPT-4o mini | Claude 3 Haiku |
| Multimodal Engine parameter: `true` | GPT-4o mini | Claude 3 Haiku |
| Source IDs parameter is specified | GPT-4o mini | Claude 3.7 Sonnet |
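As a sketch of the Query Group case, the fragment below sets the provider and enables the multimodal engine, which per the table above keeps the Claude 3 Haiku model. The key names (`llmEngine`, `multimodalEngine`, `queries`) and the example query are illustrative assumptions; consult the SenseML Query Group reference for exact parameter names:

```json
{
  "method": {
    "id": "queryGroup",
    "llmEngine": { "provider": "anthropic" },
    "multimodalEngine": true,
    "queries": [
      { "id": "policy_number", "description": "policy number" }
    ]
  }
}
```

If a query instead specifies the Source IDs parameter with `provider: anthropic`, the table above shows Sensible switches to Claude 3.7 Sonnet.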

NLP Table method

| Configuration | LLM model |
| --- | --- |
| LLM engine parameter: `provider: openai` | GPT-4o |
| LLM engine parameter: `provider: anthropic` | Claude 3 Haiku |