Managed providers
These providers are available with AI Gateway API Keys, so no provider accounts are needed; ngrok handles authentication automatically.

OpenAI
GPT and o-series models. Managed keys available.
Anthropic
Claude models. Managed keys available. Supports both OpenAI and Anthropic SDK formats.
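To make the request shape concrete, here is a minimal sketch of an OpenAI-format chat-completions call aimed at the gateway. The endpoint URL, header layout, and model string below are illustrative placeholders, not documented values:

```python
import json

# Hypothetical gateway endpoint -- substitute the real AI Gateway URL.
GATEWAY_URL = "https://example-gateway.invalid/v1/chat/completions"

def build_chat_request(model: str, prompt: str, api_key: str):
    """Build an OpenAI-format chat-completions request for the gateway.

    Because managed Anthropic models accept the OpenAI SDK format,
    the same payload shape works for Claude models as well.
    """
    headers = {
        "Authorization": f"Bearer {api_key}",  # AI Gateway API Key
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,  # e.g. a Claude model served by the gateway
        "messages": [{"role": "user", "content": prompt}],
    })
    return GATEWAY_URL, headers, body

# Example: an OpenAI-format request targeting a managed Anthropic model.
url, headers, body = build_chat_request(
    "anthropic:claude-3.5-sonnet", "Hello", "MY_KEY"
)
```

The request body here is only constructed, not sent; any HTTP client can then POST it to the gateway.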
BYOK providers
These providers require you to bring your own key. See each provider's page for setup instructions.

OpenRouter
Access hundreds of models from multiple providers through a single API.
Google
Gemini models from Google AI Studio.
Groq
LPU-accelerated inference for open-source models (Llama, Mixtral).
DeepSeek
High-performance reasoning and chat models.
Hyperbolic
Open-source model hosting with high-performance inference.
InceptionLabs
Diffusion-based language models for fast text generation.
Inference.net
Distributed inference network for AI models at scale.
Self-hosted providers
Run open-source models on your own infrastructure and connect them to the gateway.

Ollama
Run open-source models locally with Ollama.
vLLM
High-performance inference server.
LM Studio
Desktop app for local model inference.
Azure OpenAI
Microsoft’s OpenAI service on Azure.
How provider selection works
When a request arrives, the gateway determines which provider to use:
- Explicit provider prefix: if the model name includes a provider prefix (for example, openai:gpt-4o or openrouter:anthropic/claude-3.5-sonnet), that provider is used
- Catalog lookup: otherwise, the gateway looks up the model ID in its catalog to find the default provider
- Selection strategy: if configured, model selection strategies can override the default
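The selection order above can be sketched as a small resolver. This is an illustrative model of the logic, not the gateway's actual implementation; the `strategy` callback stands in for a configured selection strategy:

```python
def resolve_provider(model: str, catalog: dict, strategy=None):
    """Illustrative sketch of the gateway's provider-selection order."""
    # 1. Explicit provider prefix, e.g. "openai:gpt-4o" or
    #    "openrouter:anthropic/claude-3.5-sonnet" -- split on the
    #    first colon only, since model IDs may contain slashes.
    if ":" in model:
        provider, model_id = model.split(":", 1)
        return provider, model_id

    # 2. Catalog lookup: map the model ID to its default provider.
    provider = catalog.get(model)

    # 3. A configured selection strategy may override the default.
    if strategy is not None:
        provider = strategy(model, provider)

    if provider is None:
        raise ValueError(f"unknown model: {model}")
    return provider, model
```

For example, `resolve_provider("openrouter:anthropic/claude-3.5-sonnet", {})` returns `("openrouter", "anthropic/claude-3.5-sonnet")`, while an unprefixed `"gpt-4o"` falls through to the catalog.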
Next steps
- Model Catalog: browse all available models by provider
- Bring Your Own Keys: manage provider API keys centrally
- Multi-provider failover: automatic failover across providers