
Providers Configuration

A detailed guide to configuring LLM providers.
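
Each provider is defined as a JSON object with the fields described in this guide. In a typical setup these objects are collected in a Providers array in the router's configuration file; the surrounding structure below is a minimal sketch, and the exact top-level layout may differ between versions.

{
  "Providers": [
    {
      "NAME": "deepseek",
      "HOST": "https://api.deepseek.com",
      "APIKEY": "your-api-key",
      "MODELS": ["deepseek-chat", "deepseek-coder"],
      "transformers": ["anthropic"]
    }
  ]
}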

Supported Providers

DeepSeek

{
  "NAME": "deepseek",
  "HOST": "https://api.deepseek.com",
  "APIKEY": "your-api-key",
  "MODELS": ["deepseek-chat", "deepseek-coder"],
  "transformers": ["anthropic"]
}

Groq

{
  "NAME": "groq",
  "HOST": "https://api.groq.com/openai/v1",
  "APIKEY": "your-api-key",
  "MODELS": ["llama-3.3-70b-versatile"],
  "transformers": ["anthropic"]
}

Gemini

{
  "NAME": "gemini",
  "HOST": "https://generativelanguage.googleapis.com/v1beta",
  "APIKEY": "your-api-key",
  "MODELS": ["gemini-1.5-pro"],
  "transformers": ["anthropic"]
}

OpenRouter

{
  "NAME": "openrouter",
  "HOST": "https://openrouter.ai/api/v1",
  "APIKEY": "your-api-key",
  "MODELS": ["anthropic/claude-3.5-sonnet"],
  "transformers": ["anthropic"]
}

Provider Configuration Options

Field          Type       Required   Description
NAME           string     Yes        Unique provider identifier
HOST           string     Yes        API base URL
APIKEY         string     Yes        API authentication key
MODELS         string[]   No         List of available models
transformers   string[]   No         List of transformers to apply
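
Because MODELS and transformers are optional, a minimal provider entry needs only the three required fields. The sketch below uses placeholder values:

{
  "NAME": "my-provider",
  "HOST": "https://api.example.com/v1",
  "APIKEY": "your-api-key"
}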

Model Selection

When selecting a model in routing, use the format:

{provider-name},{model-name}

For example:

deepseek,deepseek-chat
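
As an illustration, a routing rule pointing at the DeepSeek provider above could reference the model as follows. Only the {provider-name},{model-name} string comes from this guide; the surrounding Router object and the default key are assumptions about the routing configuration, which is documented separately.

{
  "Router": {
    "default": "deepseek,deepseek-chat"
  }
}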

Next Steps