Anthropic provides Claude models. The AI Gateway supports Anthropic as a managed provider (no Anthropic account needed) and also works with your own Anthropic API key. You can call Claude models through either the OpenAI-compatible API (chat completions format) or the native Anthropic API (messages format); both are supported.

Managed keys setup

No Anthropic account needed. Follow the quickstart to create your gateway and API key, then point your SDK at the gateway:
from openai import OpenAI

client = OpenAI(
    base_url="https://your-ai-gateway.ngrok.app/v1",
    api_key="ng-xxxxx-g1-xxxxx"  # Your AI Gateway API Key
)

response = client.chat.completions.create(
    model="claude-opus-4-6",
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)
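The gateway also accepts the native Anthropic messages format. As a sketch, the equivalent request body looks like this (the `/v1/messages` path is assumed from Anthropic's API conventions; `max_tokens` is required by the Messages API):

```python
import json

# Request body in the native Anthropic Messages API format.
# POST this to your gateway, e.g. https://your-ai-gateway.ngrok.app/v1/messages
# (the exact path is an assumption), using your gateway API key.
payload = {
    "model": "claude-opus-4-6",
    "max_tokens": 1024,  # required by the Messages API
    "messages": [{"role": "user", "content": "Hello!"}],
}

print(json.dumps(payload, indent=2))
```

The official `anthropic` SDK can send the same request if you point its `base_url` at your gateway.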

Bring your own key

Use your own Anthropic API key. Pass it directly—the gateway forwards it to Anthropic.
1. Create an AI Gateway

If you don’t have one yet, follow the quickstart to create your AI Gateway endpoint.
2. Get an Anthropic API key

Create an account at console.anthropic.com and generate an API key. Keys start with sk-ant-.
3. Make a request

Use your Anthropic key as the api_key; the gateway forwards it to Anthropic unchanged.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-ai-gateway.ngrok.app/v1",
    api_key="sk-ant-..."  # Your Anthropic key, forwarded by gateway
)

response = client.chat.completions.create(
    model="claude-opus-4-6",
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)

Store key in the gateway

Instead of each client passing their own key, you can store it once in ngrok Secrets and have the gateway inject it automatically.
Storing your provider key in the gateway makes your endpoint publicly accessible. You must add authorization to prevent unauthorized use and unexpected charges. See Protecting BYOK Endpoints.
1. Store your key in ngrok Secrets

ngrok api secrets create \
  --name anthropic \
  --secret-data '{"api-key": "sk-ant-..."}'
Or use the Vaults & Secrets dashboard.
2. Configure your Traffic Policy

on_http_request:
  - type: ai-gateway
    config:
      providers:
        - id: "anthropic"
          api_keys:
            - value: ${secrets.get('anthropic', 'api-key')}
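Because a stored provider key makes the endpoint usable by anyone, the warning above applies: put an authorization step ahead of the ai-gateway action. A minimal sketch using ngrok's basic-auth Traffic Policy action (the credentials are placeholders; see Protecting BYOK Endpoints for the recommended options):

```yaml
on_http_request:
  # Reject unauthenticated requests before the key is injected.
  - type: basic-auth
    config:
      credentials:
        - "user:change-this-password"
  - type: ai-gateway
    config:
      providers:
        - id: "anthropic"
          api_keys:
            - value: ${secrets.get('anthropic', 'api-key')}
```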
3. Make a request

Clients no longer need an Anthropic key—pass any value for api_key.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-ai-gateway.ngrok.app/v1",
    api_key="unused"  # Gateway injects your Anthropic key
)

response = client.chat.completions.create(
    model="claude-opus-4-6",
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)

Explicit provider routing

Use the anthropic: prefix to force routing to Anthropic specifically:
response = client.chat.completions.create(
    model="anthropic:claude-opus-4-6",
    messages=[{"role": "user", "content": "Hello!"}]
)

Available models

See the model catalog for the full list of supported Claude models.

Next steps