Prerequisite: You need an AI Gateway endpoint before continuing. Create one using the dashboard quickstart or follow the manual setup guide.
Installation
Basic usage
Configure the OpenAI adapter to use your AI Gateway endpoint:
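A minimal sketch of the wiring, using the plain `openai` client as a stand-in for TanStack AI's OpenAI adapter (the adapter's exact option names are in the TanStack AI docs). The gateway-specific part is simply pointing the base URL at your endpoint; `AI_GATEWAY_URL` is a placeholder variable name.

```typescript
// Sketch: the standard "openai" client pointed at the AI Gateway instead of
// api.openai.com. TanStack AI's OpenAI adapter takes the same two pieces of
// information -- a base URL and a key -- under its own option names.
import OpenAI from "openai";

export const client = new OpenAI({
  baseURL: process.env.AI_GATEWAY_URL, // your AI Gateway endpoint (placeholder env var)
  // Whether a real provider key is needed here depends on how keys are
  // configured on the gateway; any non-empty string satisfies the client.
  apiKey: process.env.OPENAI_API_KEY ?? "unused",
});
```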
Streaming responses

TanStack AI is built for streaming. Process chunks as they arrive:
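A streaming sketch with the same stand-in client: the request sets `stream: true`, and the loop handles each content delta the moment the gateway forwards it. The model name is an example.

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: process.env.AI_GATEWAY_URL, // gateway endpoint (placeholder env var)
  apiKey: process.env.OPENAI_API_KEY ?? "unused",
});

// Ask the gateway for a streamed completion and consume it chunk by chunk.
const stream = await client.chat.completions.create({
  model: "openai/gpt-4o-mini", // example; provider prefixes are covered below
  messages: [{ role: "user", content: "Write a haiku about tunnels." }],
  stream: true,
});

for await (const chunk of stream) {
  // Each chunk carries an incremental piece of the assistant's reply.
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```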
API route handler

Build API endpoints that stream responses, for example in `app/api/chat/route.ts`:
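A sketch of the route handler: it forwards the conversation to the gateway, then re-streams the content deltas to the browser as plain text via a web `ReadableStream`. It deliberately uses the plain `openai` client rather than a TanStack-specific server helper; adapt it to whichever helper you use.

```typescript
// app/api/chat/route.ts -- sketch; swap in your TanStack AI server helpers as needed.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: process.env.AI_GATEWAY_URL, // gateway endpoint (placeholder env var)
  apiKey: process.env.OPENAI_API_KEY ?? "unused",
});

export async function POST(req: Request) {
  const { messages } = await req.json();

  const upstream = await client.chat.completions.create({
    model: "openai/gpt-4o-mini", // example model; the gateway routes on this name
    messages,
    stream: true,
  });

  // Re-stream the content deltas to the client as plain text.
  const body = new ReadableStream<Uint8Array>({
    async start(controller) {
      const encoder = new TextEncoder();
      for await (const chunk of upstream) {
        controller.enqueue(encoder.encode(chunk.choices[0]?.delta?.content ?? ""));
      }
      controller.close();
    },
  });

  return new Response(body, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```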
Using different providers
The AI Gateway routes based on the model name. Use provider prefixes to target specific providers:
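For example, the same stand-in client can target different providers just by changing the prefix. The model identifiers below are illustrative and must match providers configured on your gateway.

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: process.env.AI_GATEWAY_URL, // gateway endpoint (placeholder env var)
  apiKey: process.env.OPENAI_API_KEY ?? "unused",
});

// The gateway reads the "openai/" or "anthropic/" prefix and routes the request
// to that provider. Model identifiers are examples only.
await client.chat.completions.create({
  model: "openai/gpt-4o-mini",
  messages: [{ role: "user", content: "Hello via OpenAI" }],
});

await client.chat.completions.create({
  model: "anthropic/claude-sonnet-4",
  messages: [{ role: "user", content: "Hello via Anthropic" }],
});
```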
Automatic model selection

Let the gateway choose the best model by passing `ngrok/auto` as the model name. The selection strategy is configured on the gateway side in `traffic-policy.yaml`; see Model Selection Strategies in the next steps below for the configuration details.
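On the client side nothing changes beyond the model name. A sketch with the stand-in client; the gateway, not the client, decides which concrete model serves the request:

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: process.env.AI_GATEWAY_URL, // gateway endpoint (placeholder env var)
  apiKey: process.env.OPENAI_API_KEY ?? "unused",
});

// "ngrok/auto" defers model choice to the gateway's selection strategy.
const completion = await client.chat.completions.create({
  model: "ngrok/auto",
  messages: [{ role: "user", content: "Summarize this changelog entry in one line." }],
});

console.log(completion.choices[0].message.content);
```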
Tool calling
The AI Gateway supports function/tool calling:
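A tool-calling sketch through the gateway using the OpenAI-style `tools` schema with a hypothetical `get_weather` function; how TanStack AI surfaces and executes tool calls on the client is described in its own docs.

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: process.env.AI_GATEWAY_URL, // gateway endpoint (placeholder env var)
  apiKey: process.env.OPENAI_API_KEY ?? "unused",
});

const completion = await client.chat.completions.create({
  model: "openai/gpt-4o-mini", // example model name
  messages: [{ role: "user", content: "What's the weather in Paris?" }],
  tools: [
    {
      type: "function",
      function: {
        name: "get_weather", // hypothetical tool for illustration
        description: "Look up the current weather for a city",
        parameters: {
          type: "object",
          properties: { city: { type: "string" } },
          required: ["city"],
        },
      },
    },
  ],
});

// If the model decided to call the tool, the call shows up here.
console.log(completion.choices[0].message.tool_calls);
```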
Provider options

Pass provider-specific options for fine-grained control:
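For instance, sampling and output-length settings ride along in the request body and are forwarded by the gateway to the provider. TanStack AI exposes these through its adapter's provider options (field names are in its docs); the sketch below shows the equivalent with the stand-in client, with illustrative values.

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: process.env.AI_GATEWAY_URL, // gateway endpoint (placeholder env var)
  apiKey: process.env.OPENAI_API_KEY ?? "unused",
});

// Provider-specific knobs are passed through by the gateway to the upstream provider.
await client.chat.completions.create({
  model: "openai/gpt-4o-mini", // example model name
  messages: [{ role: "user", content: "Give me three taglines for a tunneling tool." }],
  temperature: 0.2, // illustrative values
  max_tokens: 200,
});
```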
React integration

Use TanStack AI with React to build chat interfaces, for example in `app/page.tsx`:
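A plain-React sketch of `app/page.tsx` that posts to the route handler above and renders the streamed reply as it arrives. TanStack AI's React bindings wrap this same request/stream pattern behind hooks, so treat this as the dependency-free equivalent rather than the hook API itself.

```tsx
"use client";
// app/page.tsx -- plain-React sketch; TanStack AI's React bindings wrap this pattern.

import { useState } from "react";

export default function ChatPage() {
  const [input, setInput] = useState("");
  const [reply, setReply] = useState("");

  async function send() {
    setReply("");
    const res = await fetch("/api/chat", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ messages: [{ role: "user", content: input }] }),
    });

    // Read the streamed plain-text body chunk by chunk and append it to state.
    const reader = res.body!.getReader();
    const decoder = new TextDecoder();
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      setReply((prev) => prev + decoder.decode(value, { stream: true }));
    }
  }

  return (
    <main>
      <textarea value={input} onChange={(e) => setInput(e.target.value)} />
      <button onClick={send}>Send</button>
      <pre>{reply}</pre>
    </main>
  );
}
```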
Environment variables
Set up your environment in `.env.local`:
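The variable names below are the placeholders used by the sketches above; rename them to whatever your code actually reads.

```bash
# .env.local -- example values only
AI_GATEWAY_URL=https://your-gateway-endpoint.example.com
# Only needed if your gateway expects a provider key from the client;
# otherwise any placeholder string satisfies the sketches above.
OPENAI_API_KEY=sk-your-key
```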
Next steps
- TanStack AI Documentation - Full SDK reference
- Model Selection Strategies - Configure routing logic
- Configuring Providers - Set up providers and keys