# LLM models
You can configure which LLM models Sensible uses to find answers for LLM-based extraction methods. The following tables list the configuration options for each method.
## List method
| Configuration | LLM Engine parameter `provider: openai` | LLM Engine parameter `provider: anthropic` | LLM Engine parameter `provider: google` |
|---|---|---|---|
| LLM Engine parameter `mode: fast` | GPT-4o mini | Claude 3.5 Haiku | Gemini 2.5 Flash-Lite |
| LLM Engine parameter `mode: thorough` | GPT-4o | Claude 3.7 Sonnet | Gemini 2.5 Flash-Lite |
| LLM Engine parameter `mode: long` | GPT-4o mini | Claude 3.5 Haiku | Gemini 2.5 Flash-Lite |
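As a sketch of how these parameters combine, a List method field might select a provider and mode like this. This is an illustrative fragment, not a complete config: the field `id` and surrounding structure are hypothetical, and only the `provider` and `mode` values come from the table above.

```json
{
  "fields": [
    {
      "id": "line_items",
      "method": {
        "id": "list",
        "llmEngine": { "provider": "anthropic", "mode": "thorough" }
      }
    }
  ]
}
```

Assuming this placement of the LLM Engine parameter, the table above indicates this configuration would run on Claude 3.7 Sonnet.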
## Query Group method
| Configuration | LLM Engine parameter `provider: openai` | LLM Engine parameter `provider: anthropic` | LLM Engine parameter `provider: google` |
|---|---|---|---|
| default | GPT-4o mini | Claude 3 Haiku | Gemini 2.5 Flash-Lite |
| Multimodal Engine parameter `true` | GPT-4o mini | Claude 3 Haiku | Gemini 2.5 Flash-Lite |
| Source Ids parameter is specified | GPT-4o mini | Claude 3.7 Sonnet | Gemini 2.5 Flash-Lite |
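For comparison, a Query Group field might specify a provider like this. Again an illustrative fragment: the query `id` and `description` values are hypothetical, and only the `provider` value maps to the table above.

```json
{
  "fields": [
    {
      "method": {
        "id": "queryGroup",
        "llmEngine": { "provider": "anthropic" },
        "queries": [
          {
            "id": "policy_number",
            "description": "What is the policy number?"
          }
        ]
      }
    }
  ]
}
```

Per the table above, this default Anthropic configuration uses Claude 3 Haiku; specifying the Source Ids parameter instead selects Claude 3.7 Sonnet.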
## NLP Table method
| Configuration | LLM Engine parameter `provider: openai` | LLM Engine parameter `provider: anthropic` | LLM Engine parameter `provider: google` |
|---|---|---|---|
| default | GPT-4o | Claude 3 Haiku | Gemini 2.5 Flash-Lite |