LLM Reference

📋 Planned

Documentation Under Construction

This reference page will show how to configure Anthropic Claude, OpenAI GPT, and other LLM providers. Most users will configure their LLM in the Quick Start; this page is for reference.

Planned Content

  • ✓ Anthropic Claude setup (API key, model selection)
  • ✓ OpenAI GPT setup (API key, model selection)
  • ✓ Model selection guidance (speed vs. quality tradeoffs)
  • ✓ Environment variable configuration
  • ✓ Rate limiting and error handling (a rough retry sketch follows this list)
  • ✓ Link to Advanced guide for custom LLM integration
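
Until that content lands, here is a minimal, provider-agnostic sketch of the rate-limiting item above: a retry helper with exponential backoff. The helper name is hypothetical and none of this is vanna API; in real code you would catch the specific rate-limit exception raised by your provider's SDK rather than a bare Exception.

import random
import time

def with_backoff(call, max_attempts=5):
    """Retry a zero-argument callable, backing off exponentially between attempts."""
    for attempt in range(max_attempts):
        try:
            return call()
        except Exception:  # in real code, catch your provider's rate-limit error instead
            if attempt == max_attempts - 1:
                raise
            time.sleep(2 ** attempt + random.random())  # sleep 1s, 2s, 4s, ... plus jitter

You would wrap whichever call your code makes to the provider, e.g. with_backoff(lambda: ask_llm(question)), where ask_llm stands in for your actual call.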

Want to contribute or suggest improvements? Open an issue on GitHub.

LLM Providers

When complete, this will show:

Anthropic Claude

from vanna.integrations.anthropic import AnthropicLlmService

llm = AnthropicLlmService(
    api_key="sk-ant-...",      # placeholder; supply your Anthropic API key
    model="claude-sonnet-4-5"  # see "Choosing a Model" below
)

OpenAI GPT

from vanna.integrations.openai import OpenAILlmService

llm = OpenAILlmService(
    api_key="sk-...",  # placeholder; supply your OpenAI API key
    model="gpt-4"      # see "Choosing a Model" below
)
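
Both snippets above hardcode a placeholder key. In practice you would read it from an environment variable instead (one of the planned topics above). A minimal sketch, assuming the constructor shown above and ANTHROPIC_API_KEY as the variable name:

import os

from vanna.integrations.anthropic import AnthropicLlmService

# Assumes the key was exported in your shell first, e.g.:
#   export ANTHROPIC_API_KEY="sk-ant-..."
llm = AnthropicLlmService(
    api_key=os.environ["ANTHROPIC_API_KEY"],
    model="claude-sonnet-4-5"
)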

Choosing a Model

Coming soon: guidance on the tradeoffs below (a brief sketch of switching between tiers follows the list):

  • Speed: Claude Haiku, GPT-3.5 for quick responses
  • Quality: Claude Sonnet/Opus, GPT-4 for complex queries
  • Cost: Comparison of pricing across providers
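
In the meantime, here is a rough sketch of how the tradeoff plays out in code, assuming the constructor shown above: switching tiers is just a different model string. The model names and the make_llm helper are illustrative, not vanna API.

import os

from vanna.integrations.anthropic import AnthropicLlmService

# Model names are assumptions; check your provider's current model list.
FAST_MODEL = "claude-3-5-haiku-latest"  # lower latency and cost for quick responses
QUALITY_MODEL = "claude-sonnet-4-5"     # stronger reasoning for complex queries

def make_llm(complex_query: bool) -> AnthropicLlmService:
    """Pick a model tier per request: speed for simple questions, quality for hard ones."""
    model = QUALITY_MODEL if complex_query else FAST_MODEL
    return AnthropicLlmService(
        api_key=os.environ["ANTHROPIC_API_KEY"],
        model=model
    )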

Need a different LLM? See Custom LLM Integration.