Use OpenRouter models through the ngrok AI Gateway.
OpenRouter is a unified API that provides access to hundreds of models from providers like Anthropic, Google, Meta, Mistral, and more through a single endpoint. It requires you to bring your own key; ngrok-managed keys are not available for OpenRouter.
Always prefix OpenRouter model IDs with openrouter:. Without the prefix, the gateway may route to the native provider instead of OpenRouter.
OpenRouter uses a provider/model format for its model IDs. When calling through the ngrok AI Gateway, prefix with openrouter::
| What you want | Model string to use |
| --- | --- |
| Anthropic Claude via OpenRouter | `openrouter:anthropic/claude-3.5-sonnet` |
| Google Gemini via OpenRouter | `openrouter:google/gemini-2.0-flash-001` |
| Meta Llama via OpenRouter | `openrouter:meta-llama/llama-3.1-8b-instruct` |
| Mistral via OpenRouter | `openrouter:mistralai/mistral-7b-instruct` |
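As a sketch, you can normalize model IDs client-side before sending them to the gateway. The helper below is hypothetical, not part of any ngrok or OpenRouter SDK; it simply prepends the `openrouter:` prefix to a `provider/model` ID when it is missing.

```python
def to_openrouter_model(model_id: str) -> str:
    """Prefix a provider/model ID with 'openrouter:' so the ngrok AI
    Gateway routes it to OpenRouter rather than the native provider.
    Already-prefixed IDs are returned unchanged."""
    if model_id.startswith("openrouter:"):
        return model_id
    return f"openrouter:{model_id}"

print(to_openrouter_model("anthropic/claude-3.5-sonnet"))
# openrouter:anthropic/claude-3.5-sonnet
```

Idempotence matters here: running the helper twice (or on IDs copied from this table) never double-prefixes the model string.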
The openrouter: prefix tells the gateway to route to OpenRouter specifically, not to the native provider. Without it, anthropic/claude-3.5-sonnet would fail, because the gateway does not recognize the slash-separated format as a valid model without the provider prefix. For the full list of models OpenRouter supports, see openrouter.ai/models.
Instead of requiring each client to pass its own key, you can store the key once in ngrok Secrets and have the gateway inject it automatically.
Storing your provider key in the gateway makes your endpoint publicly accessible. You must add authorization to prevent unauthorized use and unexpected charges. See Protecting BYOK Endpoints.
With this config, clients can request models: ["anthropic:claude-3.5-sonnet", "openrouter:anthropic/claude-3.5-sonnet"] to fall back to OpenRouter if Anthropic is down.
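The fallback behavior can be illustrated client-side. This is a sketch only; the gateway performs the actual fallback when you pass a models list, and `first_available` and the stub `send` function below are hypothetical names, with the stub standing in for real network calls.

```python
def first_available(models, send):
    """Try each model string in order and return the first successful
    response, mirroring how a fallback over a models list behaves."""
    last_err = None
    for model in models:
        try:
            return model, send(model)
        except RuntimeError as err:  # stand-in for a provider outage
            last_err = err
    raise last_err

# Stub transport: pretend the native Anthropic route is down.
def send(model):
    if model.startswith("anthropic:"):
        raise RuntimeError("provider unavailable")
    return {"model": model, "ok": True}

used, response = first_available(
    ["anthropic:claude-3.5-sonnet", "openrouter:anthropic/claude-3.5-sonnet"],
    send,
)
print(used)  # openrouter:anthropic/claude-3.5-sonnet
```

Ordering the list native-provider-first means OpenRouter only absorbs traffic during an outage, which keeps normal requests on the direct route.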