LLM models

You can configure which LLM Sensible uses to find answers in context for LLM-based extraction methods. The following tables list the configuration options for each method.

List method

| configuration | LLM Engine parameter: `provider: openai` | LLM Engine parameter: `provider: anthropic` | LLM Engine parameter: `provider: google` |
| --- | --- | --- | --- |
| LLM Engine parameter: `mode: fast` | GPT-4o mini | Claude 3.5 Haiku | Gemini 2.5 Flash-Lite |
| LLM Engine parameter: `mode: thorough` | GPT-4o | Claude 3.7 Sonnet | Gemini 2.5 Flash-Lite |
| LLM Engine parameter: `mode: long` | GPT-4o mini | Claude 3.5 Haiku | Gemini 2.5 Flash-Lite |
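As a sketch of how these options might appear in a field's configuration, the fragment below selects a provider and mode for a List method field. The key names (`llmEngine`, `provider`, `mode`) are assumptions based on the parameter names in the table above; check the List method reference for the exact SenseML syntax:

```json
{
  "fields": [
    {
      "id": "line_items",
      "method": {
        "id": "list",
        "llmEngine": {
          "provider": "anthropic",
          "mode": "thorough"
        }
      }
    }
  ]
}
```

With this configuration, per the table above, Sensible would use Claude 3.7 Sonnet to find answers for this field.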

Query Group method

| configuration | LLM Engine parameter: `provider: openai` | LLM Engine parameter: `provider: anthropic` | LLM Engine parameter: `provider: google` |
| --- | --- | --- | --- |
| default | GPT-4o mini | Claude 3 Haiku | Gemini 2.5 Flash-Lite |
| Multimodal Engine parameter: `true` | GPT-4o mini | Claude 3 Haiku | Gemini 2.5 Flash-Lite |
| Source IDs parameter is specified | GPT-4o mini | Claude 3.7 Sonnet | Gemini 2.5 Flash-Lite |
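For illustration, a Query Group field with the Multimodal Engine parameter enabled might look like the following. The key names (`queryGroup`, `queries`, `multimodalEngine`) and the sample query are assumptions for this sketch; consult the Query Group method reference for the exact syntax and parameter shape:

```json
{
  "fields": [
    {
      "id": "patient_details",
      "method": {
        "id": "queryGroup",
        "queries": [
          {
            "id": "patient_name",
            "description": "patient's full name"
          }
        ],
        "multimodalEngine": true
      }
    }
  ]
}
```

Per the table above, enabling the Multimodal Engine leaves the model selection the same as the default for each provider.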

NLP Table method

| configuration | LLM Engine parameter: `provider: openai` | LLM Engine parameter: `provider: anthropic` | LLM Engine parameter: `provider: google` |
| --- | --- | --- | --- |
| default | GPT-4o | Claude 3 Haiku | Gemini 2.5 Flash-Lite |