OpenAI provides GPT and o-series models. The AI Gateway supports OpenAI as a managed provider—no OpenAI account needed—and also works with your own OpenAI API key.

Managed keys setup

No OpenAI account needed. Follow the quickstart to create your gateway and API key, then point your SDK at the gateway:
from openai import OpenAI

client = OpenAI(
    base_url="https://your-ai-gateway.ngrok.app/v1",
    api_key="ng-xxxxx-g1-xxxxx"  # Your AI Gateway API Key
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)
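Because the gateway speaks the OpenAI wire protocol at the `/v1` base URL, any HTTP client works, not just the SDK. A minimal standard-library sketch of the same request (URL and key are placeholders; the request is constructed but not sent):

```python
import json
import urllib.request

GATEWAY_URL = "https://your-ai-gateway.ngrok.app/v1"  # placeholder gateway endpoint
API_KEY = "ng-xxxxx-g1-xxxxx"  # placeholder AI Gateway API key

payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}],
}

# Same endpoint and headers the OpenAI SDK uses under the hood.
req = urllib.request.Request(
    f"{GATEWAY_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req) would send the request; omitted here.
```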

Bring your own key

Use your own OpenAI API key. Pass it directly—the gateway forwards it to OpenAI.
1. Create an AI Gateway

If you don’t have one yet, follow the quickstart to create your AI Gateway endpoint.
2. Get an OpenAI API key

Create an account at platform.openai.com and generate an API key. Keys start with sk-proj-.
3. Make a request

Pass your key directly—the gateway forwards it to OpenAI.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-ai-gateway.ngrok.app/v1",
    api_key="sk-proj-..."  # Your OpenAI key, forwarded by gateway
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)
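Rather than hardcoding the key, it is common to read it from the environment. A minimal sketch; `OPENAI_API_KEY` is the conventional variable name, which the official SDK also reads by default:

```python
import os

# Read the key from the environment; the fallback placeholder is for illustration only.
api_key = os.environ.get("OPENAI_API_KEY", "sk-proj-placeholder")

# OpenAI project keys start with "sk-proj-", gateway-issued keys with "ng-".
is_openai_key = api_key.startswith("sk-proj-")
```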

Store key in the gateway

Instead of each client passing their own key, you can store it once in ngrok Secrets and have the gateway inject it automatically.
With your provider key stored in the gateway, anyone who can reach your public endpoint can make requests billed to your OpenAI account. You must add authorization to prevent unauthorized use and unexpected charges. See Protecting BYOK Endpoints.
1. Store your key in ngrok Secrets

ngrok api secrets create \
  --name openai \
  --secret-data '{"api-key": "sk-proj-..."}'
Or use the Vaults & Secrets dashboard.
2. Configure your Traffic Policy

on_http_request:
  - type: ai-gateway
    config:
      providers:
        - id: "openai"
          api_keys:
            - value: ${secrets.get('openai', 'api-key')}
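Since the gateway now holds your provider key, the same policy should also gate who can call the endpoint. A sketch combining the ai-gateway action with basic auth (the action name and placeholder credentials are assumptions here; see Protecting BYOK Endpoints for the recommended options):

```yaml
on_http_request:
  - type: basic-auth
    config:
      credentials:
        - "user:examplepassword"  # placeholder; use a strong secret
  - type: ai-gateway
    config:
      providers:
        - id: "openai"
          api_keys:
            - value: ${secrets.get('openai', 'api-key')}
```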
3. Make a request

Clients no longer need an OpenAI key—pass any value for api_key.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-ai-gateway.ngrok.app/v1",
    api_key="unused"  # Gateway injects your OpenAI key
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)

Explicit provider routing

Use the openai: prefix to force routing to OpenAI specifically:
response = client.chat.completions.create(
    model="openai:gpt-4o",  # Always routes to OpenAI, not another provider
    messages=[{"role": "user", "content": "Hello!"}]
)
This is useful in multi-provider setups where the same model name might exist on multiple providers.
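In such setups it can help to apply the prefix programmatically rather than scattering string literals through your code. A hypothetical convenience helper (not part of the OpenAI SDK or the gateway):

```python
def with_provider(model: str, provider: str = "openai") -> str:
    """Prefix a model name with an explicit provider, unless one is already set."""
    return model if ":" in model else f"{provider}:{model}"

print(with_provider("gpt-4o"))         # openai:gpt-4o
print(with_provider("openai:gpt-4o"))  # openai:gpt-4o (unchanged)
```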

Available models

See the model catalog for the full list of supported models and their context windows.

Next steps