Prerequisite: You need an AI Gateway endpoint before using these SDK guides. Create one using the dashboard quickstart or follow the manual setup guide.
The AI Gateway works with any SDK that supports the OpenAI API format. Just change the baseURL to your gateway endpoint and you’re connected.

Quick start

The pattern is the same for any SDK—just change the base URL:
from openai import OpenAI

# Point the client at your gateway endpoint instead of the provider's API
client = OpenAI(
    base_url="https://your-ai-subdomain.ngrok.app/v1",
    api_key="your-api-key",
)

What works through the gateway

Everything your SDK supports works through the gateway:
| Feature | Supported |
| --- | --- |
| Chat completions | ✅ |
| Streaming | ✅ |
| Function/tool calling | ✅ |
| Embeddings | ✅ |
| Async clients | ✅ |
| Retries | ✅ (enhanced by gateway) |
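These features are pass-through: the gateway speaks the OpenAI wire format, so streaming and tool calling need no gateway-specific code. A minimal sketch of a tool-calling request, where the `get_weather` tool and its schema are invented for illustration:

```python
# A hypothetical tool definition in the OpenAI function-calling schema;
# the gateway forwards it to the provider unchanged.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# The same kwargs you would pass directly to a provider work through the gateway.
request = {
    "model": "openai:gpt-4o",
    "messages": [{"role": "user", "content": "Weather in Paris?"}],
    "tools": tools,
    "stream": True,  # streaming also passes through unchanged
}
# client.chat.completions.create(**request)
```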

Gateway benefits

When you route SDK requests through the AI Gateway:
  • Automatic failover - If one provider fails, the gateway tries another
  • Key rotation - Use multiple provider API keys to avoid rate limits
  • Provider switching - Change providers without changing code
  • Observability - Track usage, latency, and errors across all requests
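Provider switching without code changes pairs well with keeping the endpoint itself out of the code. A small sketch, assuming an `AI_GATEWAY_URL` environment variable — the variable name is our own convention, not something the gateway requires:

```python
import os

# Resolve the gateway endpoint from the environment so repointing the client
# (new gateway, new subdomain) is a config change, not a code change.
# AI_GATEWAY_URL is an assumed variable name for this sketch.
def gateway_base_url(default: str = "https://your-ai-subdomain.ngrok.app/v1") -> str:
    return os.getenv("AI_GATEWAY_URL", default)

# Usage with the OpenAI SDK from the quick start:
# client = OpenAI(base_url=gateway_base_url(), api_key="your-api-key")
```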

Using different providers

Use model prefixes to route to specific providers:
# OpenAI
client.chat.completions.create(model="openai:gpt-4o", ...)

# Anthropic (through the gateway)
client.chat.completions.create(model="anthropic:claude-3-5-sonnet-latest", ...)

# Your self-hosted Ollama
client.chat.completions.create(model="ollama:llama3.2", ...)
Or let the gateway choose with ngrok/auto:
client.chat.completions.create(model="ngrok/auto", ...)
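Since the prefix is just part of the model string, a tiny helper can keep routing choices in one place. This helper is our own sketch, not part of any SDK or the gateway:

```python
# Providers shown in this guide; extend to match your gateway configuration.
KNOWN_PROVIDERS = {"openai", "anthropic", "ollama"}

def prefixed_model(provider: str, model: str) -> str:
    """Build a 'provider:model' string for gateway routing."""
    if provider not in KNOWN_PROVIDERS:
        raise ValueError(f"unknown provider: {provider!r}")
    return f"{provider}:{model}"

# prefixed_model("anthropic", "claude-3-5-sonnet-latest")
# -> "anthropic:claude-3-5-sonnet-latest"
```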

Next steps

Choose your SDK guide to get started.