Prerequisite: You need an AI Gateway endpoint before using these SDK guides. Create one using the dashboard quickstart or follow the manual setup guide.
The AI Gateway works with official and third-party SDKs. Change the base URL in your SDK configuration and you’re connected.

Supported SDKs

OpenAI SDK

Official SDKs for Python, TypeScript, Go, Java, and .NET

Anthropic SDK

Official SDKs for Python, TypeScript, Java, Go, Ruby, C#, and PHP

Vercel AI SDK

Build AI apps with React, Next.js, and streaming

TanStack AI

Type-safe AI library for React and Solid

LangChain

Framework for chains, agents, and RAG

Other SDKs

cURL, HTTP clients

Quick start

The pattern is the same for any SDK: just change the base URL.

Which API key do I use?
  • AI Gateway API Keys (recommended): Use your AI Gateway API Key (format: ng-xxxxx-g1-xxxxx). ngrok handles provider keys for you.
  • BYOK (passthrough mode): Use your provider API key (for example, sk-... from OpenAI). See Bring Your Own Keys.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-ai-gateway.ngrok.app/v1",
    api_key="ng-xxxxx-g1-xxxxx"  # Your AI Gateway API Key
)

What works through the gateway

Everything your SDK supports works through the gateway:
Feature                  Supported
Chat Completions API     ✅
Messages API             ✅ (for Anthropic SDKs)
Responses API            ✅
Streaming                ✅
Function/tool calling    ✅
Embeddings               ✅
Async clients            ✅
Retries                  ✅ (enhanced by gateway)

Gateway benefits

  • Automatic failover - If one provider fails, the gateway tries another
  • Key rotation - Use multiple provider API keys to avoid rate limits
  • Provider switching - Change providers without changing code
  • Observability - Track usage, latency, and errors across all requests

Using different providers

Use model prefixes to route to specific providers:
# OpenAI
client.chat.completions.create(model="openai:gpt-4o", ...)

# Anthropic
client.chat.completions.create(model="anthropic:claude-3-5-sonnet-latest", ...)

# Your self-hosted Ollama
client.chat.completions.create(model="ollama:llama3.2", ...)
Or let the gateway choose with ngrok/auto:
client.chat.completions.create(model="ngrok/auto", ...)

Next steps

Choose your SDK guide to get started: