Prerequisite: You need an AI Gateway endpoint before continuing. Create one using the dashboard quickstart or follow the manual setup guide.
Installation
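The examples on this page assume the Vercel AI SDK core package, its OpenAI-compatible provider, and zod (used for tool schemas later). A typical install looks like:

```bash
npm install ai @ai-sdk/openai zod
```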
Basic usage
Point the OpenAI provider at your AI Gateway endpoint:
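A minimal sketch. `AI_GATEWAY_URL` and `AI_GATEWAY_API_KEY` are placeholder environment variable names for your gateway endpoint and credentials, and `gpt-4o` is an illustrative model ID:

```ts
import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

// OpenAI-compatible provider pointed at the AI Gateway endpoint.
const gateway = createOpenAI({
  baseURL: process.env.AI_GATEWAY_URL,
  apiKey: process.env.AI_GATEWAY_API_KEY,
});

const { text } = await generateText({
  model: gateway('gpt-4o'), // illustrative model ID
  prompt: 'Write a haiku about tunnels.',
});

console.log(text);
```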
Streaming responses

The AI Gateway fully supports streaming. Use `streamText` for real-time responses:
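A minimal sketch, reusing the placeholder environment variable names from above. The `textStream` iterable shown here is part of the AI SDK's `streamText` result:

```ts
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

// Same provider setup as in Basic usage (placeholder env var names).
const gateway = createOpenAI({
  baseURL: process.env.AI_GATEWAY_URL,
  apiKey: process.env.AI_GATEWAY_API_KEY,
});

const result = streamText({
  model: gateway('gpt-4o'), // illustrative model ID
  prompt: 'Explain how a reverse proxy works.',
});

// Print tokens as they arrive.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```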
Chat interface
Build chat applications with the `useChat` hook:
app/page.tsx
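A client component sketch following the AI SDK 4 style `useChat` API (newer SDK versions manage input state separately and expose message parts). By default the hook posts to `/api/chat`:

```tsx
'use client';

// useChat lives in '@ai-sdk/react' (older SDK versions export it from 'ai/react').
import { useChat } from '@ai-sdk/react';

export default function Chat() {
  // The hook manages message state and streaming updates.
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <div>
      {messages.map((message) => (
        <div key={message.id}>
          <strong>{message.role}:</strong> {message.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Say something..."
        />
      </form>
    </div>
  );
}
```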
app/api/chat/route.ts
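A matching route handler sketch, again pointing the provider at the gateway. The response helper name varies across AI SDK versions; this follows the AI SDK 4 surface:

```ts
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

// Provider pointed at the AI Gateway (placeholder env var names).
const gateway = createOpenAI({
  baseURL: process.env.AI_GATEWAY_URL,
  apiKey: process.env.AI_GATEWAY_API_KEY,
});

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: gateway('gpt-4o'), // illustrative model ID
    messages,
  });

  // AI SDK 4 helper; newer versions use a different response helper.
  return result.toDataStreamResponse();
}
```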
Using different providers
The AI Gateway routes based on the model name. Use provider prefixes to be explicit:
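For example, a `provider/model` prefix selects a specific upstream. The prefixes and model IDs below are illustrative; the exact values come from your gateway's provider configuration:

```ts
import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

const gateway = createOpenAI({
  baseURL: process.env.AI_GATEWAY_URL, // placeholder env var names
  apiKey: process.env.AI_GATEWAY_API_KEY,
});

// Explicit provider prefixes (illustrative IDs); the gateway routes on the prefix.
const fromOpenAI = await generateText({
  model: gateway('openai/gpt-4o'),
  prompt: 'Summarize HTTP/2 in one sentence.',
});

const fromAnthropic = await generateText({
  model: gateway('anthropic/claude-3-5-sonnet'),
  prompt: 'Summarize HTTP/3 in one sentence.',
});
```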
Automatic model selection

Let the gateway choose the best model with `ngrok/auto`:
traffic-policy.yaml
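In application code, `ngrok/auto` is passed like any other model ID and the gateway resolves it to a concrete model according to your policy. A minimal sketch, reusing the placeholder environment variable names from above:

```ts
import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

const gateway = createOpenAI({
  baseURL: process.env.AI_GATEWAY_URL, // placeholder env var names
  apiKey: process.env.AI_GATEWAY_API_KEY,
});

// The gateway picks the concrete model; the application just asks for ngrok/auto.
const { text } = await generateText({
  model: gateway('ngrok/auto'),
  prompt: 'What is the capital of France?',
});

console.log(text);
```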
Environment variables
Set up your environment:

.env.local
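The variable names below are the placeholders used throughout the examples on this page; substitute your actual gateway endpoint URL and key:

```bash
# Placeholder names; point them at your AI Gateway endpoint and credentials.
AI_GATEWAY_URL=https://<your-gateway-endpoint>
AI_GATEWAY_API_KEY=<your-api-key>
```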
Tool calling
The AI Gateway supports function/tool calling:
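A sketch using the AI SDK's `tool` helper with a hypothetical `getWeather` tool. The `parameters` key follows the AI SDK 4 `tool()` signature (newer versions rename it `inputSchema`); model ID and env var names are placeholders as before:

```ts
import { createOpenAI } from '@ai-sdk/openai';
import { generateText, tool } from 'ai';
import { z } from 'zod';

// Provider pointed at the AI Gateway (placeholder env var names).
const gateway = createOpenAI({
  baseURL: process.env.AI_GATEWAY_URL,
  apiKey: process.env.AI_GATEWAY_API_KEY,
});

const { text, toolResults } = await generateText({
  model: gateway('gpt-4o'), // illustrative model ID
  prompt: 'What is the weather like in San Francisco?',
  tools: {
    // Hypothetical tool, defined here for illustration only.
    getWeather: tool({
      description: 'Get the current weather for a city',
      parameters: z.object({ city: z.string() }),
      execute: async ({ city }) => ({ city, temperatureF: 68, conditions: 'foggy' }),
    }),
  },
});

console.log(toolResults);
console.log(text);
```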
Error handling

Handle errors gracefully:
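A sketch using the AI SDK's `APICallError` to distinguish upstream/gateway errors (rate limits, auth failures, unknown models) from anything else:

```ts
import { createOpenAI } from '@ai-sdk/openai';
import { generateText, APICallError } from 'ai';

const gateway = createOpenAI({
  baseURL: process.env.AI_GATEWAY_URL, // placeholder env var names
  apiKey: process.env.AI_GATEWAY_API_KEY,
});

try {
  const { text } = await generateText({
    model: gateway('gpt-4o'), // illustrative model ID
    prompt: 'Hello!',
  });
  console.log(text);
} catch (error) {
  if (APICallError.isInstance(error)) {
    // The gateway or upstream provider returned an error response.
    console.error('Gateway call failed:', error.statusCode, error.message);
  } else {
    console.error('Unexpected error:', error);
  }
}
```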
Next steps

- Vercel AI SDK Documentation - Full SDK reference
- Model Selection Strategies - Configure routing logic
- Configuring Providers - Set up providers and keys