Why use the AI Gateway?
No Provider Accounts Needed
Use OpenAI, Anthropic, and more without signing up for each provider. ngrok manages the provider keys for you.
Automatic Failover
If one provider fails, the gateway automatically tries the next model, provider, or key.
Compatible With Popular SDKs
Works with official and third-party SDKs. Change the base URL and use your AI Gateway API Key.
Self-Hosted Models
Route to self-hosted inference servers like Ollama or vLLM alongside cloud providers using Bring Your Own Keys (BYOK).
Quick example
Point your SDK at your ngrok endpoint and use your AI Gateway API Key. On each request, the gateway:
- Receives your request with your AI Gateway API Key
- Validates your key
- Selects which model and provider to use
- Forwards the request with ngrok’s managed provider API keys
- If it fails, retries with the next option in your failover chain
- Returns the response
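The flow above can be sketched with only the Python standard library. This is a minimal, hedged example: the gateway URL and API key below are placeholders for your own values, and the model name `ngrok/auto` is the automatic-selection model mentioned later in this page. The request is constructed but not sent, so the sketch stays self-contained.

```python
import json
import urllib.request

# Placeholders: substitute your own ngrok AI Gateway endpoint and key.
GATEWAY_URL = "https://example.ngrok.app/v1/chat/completions"
API_KEY = "AI_GATEWAY_API_KEY"

# "ngrok/auto" lets the gateway pick the best model for the request.
body = json.dumps({
    "model": "ngrok/auto",
    "messages": [{"role": "user", "content": "Hello!"}],
}).encode()

req = urllib.request.Request(
    GATEWAY_URL,
    data=body,
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(req) would actually send the request;
# it is omitted here so the example runs without network access.
print(req.get_header("Authorization"))
```

With an official SDK (e.g. the OpenAI client), the equivalent change is to set the client's base URL to your ngrok endpoint and pass your AI Gateway API Key in place of a provider key.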
What can you do?
| Use Case | Description |
|---|---|
| Use without provider accounts | Get started with OpenAI and Anthropic using just an AI Gateway API Key |
| Automatic model selection | Use ngrok/auto to let the gateway pick the best model |
| Multi-provider failover | If one provider fails, automatically try another |
| Custom selection strategies | Define exactly how models are selected using CEL expressions |
| Cost-based routing | Route to the cheapest available model automatically |
| Access control | Restrict which providers and models clients can use |
| Self-hosted models | Route to Ollama, vLLM, or other local inference servers (requires BYOK) |
| Content modification | Redact PII, sanitize responses, or inject prompts |
AI Gateway requires the Pay-as-you-go plan. See Credits for pricing details.