LLM Reference
Planned
Documentation Under Construction
This reference page will show how to configure Anthropic Claude, OpenAI GPT, and other LLM providers. Most users will configure their LLM in the Quick Start; this page is for reference.
Planned Content
- Anthropic Claude setup (API key, model selection)
- OpenAI GPT setup (API key, model selection)
- Model selection guidance (speed vs. quality tradeoffs)
- Environment variable configuration
- Rate limiting and error handling
- Link to Advanced guide for custom LLM integration
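Until the environment variable and rate limiting sections above are written, here is a minimal sketch of both ideas using only the standard library. The variable name `ANTHROPIC_API_KEY` and the helper `call_with_retries` are assumptions for illustration, not part of the Vanna API; real provider SDKs raise specific rate-limit exception types, which a production retry loop would catch instead of bare `Exception`.

```python
import os
import random
import time

# Read the API key from an environment variable instead of hardcoding it.
# ANTHROPIC_API_KEY is an assumed name; the finished page would confirm it.
api_key = os.environ.get("ANTHROPIC_API_KEY", "")

def call_with_retries(fn, max_attempts=5, base_delay=1.0):
    """Retry a callable with exponential backoff and jitter.

    Sketch only: here ANY exception triggers a retry, which is too broad
    for production use -- catch the provider's rate-limit error instead.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the original error
            # Exponential backoff: base, 2x, 4x, ... plus random jitter
            # so concurrent clients do not retry in lockstep.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
```

A call such as `call_with_retries(lambda: llm.ask(question))` would then absorb transient rate-limit failures before giving up.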
Want to contribute or suggest improvements? Open an issue on GitHub.
LLM Providers
When complete, this will show:
Anthropic Claude
```python
from vanna.integrations.anthropic import AnthropicLlmService

llm = AnthropicLlmService(
    api_key="sk-ant-...",
    model="claude-sonnet-4-5"
)
```

OpenAI GPT

```python
from vanna.integrations.openai import OpenAILlmService

llm = OpenAILlmService(
    api_key="sk-...",
    model="gpt-4"
)
```

Choosing a Model
Coming soon. Guidance will cover:
- Speed: Claude Haiku or GPT-3.5 for quick responses
- Quality: Claude Sonnet/Opus or GPT-4 for complex queries
- Cost: comparison of pricing across providers
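The speed/quality tradeoff above can be sketched as a small lookup helper. This is a hypothetical illustration, not part of Vanna: the function `pick_model` and its priority keys are invented here, and the model identifiers are examples drawn from this page rather than a pricing recommendation.

```python
# Map a priority to an example model name (identifiers are illustrative).
MODEL_BY_PRIORITY = {
    "speed": "gpt-3.5-turbo",        # quick responses
    "quality": "claude-sonnet-4-5",  # complex queries
}

def pick_model(priority: str) -> str:
    """Return a model name for the given priority, defaulting to quality."""
    return MODEL_BY_PRIORITY.get(priority, MODEL_BY_PRIORITY["quality"])
```

An application might call `pick_model("speed")` for interactive autocomplete-style queries and `pick_model("quality")` for complex SQL generation.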
Need a different LLM? See Custom LLM Integration.