OpenRouter is a unified API that provides access to hundreds of models from providers like Anthropic, Google, Meta, Mistral, and more through a single endpoint. It requires you to bring your own key—ngrok-managed keys are not available for OpenRouter.

Setup

1. Create an AI Gateway endpoint

If you don’t have one yet, follow the quickstart to create your AI Gateway endpoint.
2. Get an OpenRouter API key

Create an account at openrouter.ai and generate an API key from your dashboard. Keys start with sk-or-v1-.
3. Make a request

Pass your key directly—the gateway forwards it to OpenRouter.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-ai-gateway.ngrok.app/v1",
    api_key="sk-or-v1-..."  # Your OpenRouter key, forwarded by gateway
)

response = client.chat.completions.create(
    model="openrouter:anthropic/claude-3.5-sonnet",
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)
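Since the gateway forwards whatever key you supply, a cheap client-side sanity check can catch the wrong key being passed (an Anthropic or OpenAI key instead of an OpenRouter one). This helper is a hypothetical convenience, not part of any SDK; it only checks the sk-or-v1- prefix mentioned in step 2:

```python
def looks_like_openrouter_key(key: str) -> bool:
    """Cheap sanity check: OpenRouter keys start with 'sk-or-v1-'."""
    return key.startswith("sk-or-v1-")

print(looks_like_openrouter_key("sk-or-v1-abc123"))   # True
print(looks_like_openrouter_key("sk-ant-api03-xyz"))  # False: not an OpenRouter key
```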

Model names

Always prefix OpenRouter model IDs with openrouter:. Without the prefix, the request will not be routed to OpenRouter.
OpenRouter uses a provider/model format for its model IDs. When calling through the ngrok AI Gateway, prefix each ID with openrouter::
| What you want | Model string to use |
| --- | --- |
| Anthropic Claude via OpenRouter | openrouter:anthropic/claude-3.5-sonnet |
| Google Gemini via OpenRouter | openrouter:google/gemini-2.0-flash-001 |
| Meta Llama via OpenRouter | openrouter:meta-llama/llama-3.1-8b-instruct |
| Mistral via OpenRouter | openrouter:mistralai/mistral-7b-instruct |
The openrouter: prefix tells the gateway to route to OpenRouter specifically, rather than to the native provider. Without it, anthropic/claude-3.5-sonnet would fail, because the gateway doesn't recognize the slash-separated format as a valid model ID on its own. For the full list of models OpenRouter supports, see openrouter.ai/models.
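If you build model strings programmatically, a small helper can ensure the prefix is always present. This function is a hypothetical convenience for illustration, not part of any SDK:

```python
def openrouter_model(model_id: str) -> str:
    """Prefix an OpenRouter provider/model ID for routing through the gateway."""
    if model_id.startswith("openrouter:"):
        return model_id  # already prefixed, leave unchanged
    return f"openrouter:{model_id}"

print(openrouter_model("anthropic/claude-3.5-sonnet"))
# openrouter:anthropic/claude-3.5-sonnet
```

The result can be passed directly as the model argument in the request examples above.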

Store key in the gateway

Instead of each client passing their own key, you can store it once in ngrok Secrets and have the gateway inject it automatically.
Storing your provider key in the gateway makes your endpoint publicly accessible. You must add authorization to prevent unauthorized use and unexpected charges. See Protecting BYOK Endpoints.
1. Store your key in ngrok Secrets

ngrok api secrets create \
  --name openrouter \
  --secret-data '{"api-key": "sk-or-v1-..."}'
Or use the Vaults & Secrets dashboard.
2. Configure your traffic policy

on_http_request:
  - type: ai-gateway
    config:
      providers:
        - id: "openrouter"
          api_keys:
            - value: ${secrets.get('openrouter', 'api-key')}
3. Make a request

Clients no longer need an OpenRouter key—pass any value for api_key.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-ai-gateway.ngrok.app/v1",
    api_key="unused"  # Gateway injects your OpenRouter key
)

response = client.chat.completions.create(
    model="openrouter:anthropic/claude-3.5-sonnet",
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)

Failover with another provider

Use OpenRouter as a fallback if your primary provider is unavailable:
on_http_request:
  - type: ai-gateway
    config:
      providers:
        - id: "anthropic"
          api_keys:
            - value: ${secrets.get('anthropic', 'api-key')}
        - id: "openrouter"
          api_keys:
            - value: ${secrets.get('openrouter', 'api-key')}
      model_selection:
        strategy:
          - "ai.models.filter(m, m.provider_id == 'anthropic')"
          - "ai.models.filter(m, m.provider_id == 'openrouter')"
With this config, clients can request models: ["anthropic:claude-3.5-sonnet", "openrouter:anthropic/claude-3.5-sonnet"] to fall back to OpenRouter if Anthropic is down.
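The fallback request body can be sketched as plain JSON. The models field as an ordered fallback list follows the example above; verify the exact field names the gateway accepts against the AI Gateway docs. With the OpenAI Python SDK, non-standard fields like this can typically be sent via the extra_body parameter:

```python
import json

# Sketch of a failover request body matching the config above:
# try Anthropic directly first, then the same model via OpenRouter.
payload = {
    "models": [
        "anthropic:claude-3.5-sonnet",             # primary: Anthropic direct
        "openrouter:anthropic/claude-3.5-sonnet",  # fallback: via OpenRouter
    ],
    "messages": [{"role": "user", "content": "Hello!"}],
}

print(json.dumps(payload, indent=2))
```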

Next steps