Vercel AI SDK - Add AI chat/features to your apps. Use when building AI-powered products, chatbots, or any AI integration.
Add AI to your apps. Chat, streaming, multiple providers.
| Want | AI SDK Does It |
|---|---|
| Chat with AI in your app | ✓ Yes |
| Streaming responses (like ChatGPT) | ✓ Yes |
| Use Claude, GPT, Gemini | ✓ All of them |
| AI-generated content | ✓ Yes |
| Tool use / function calling | ✓ Yes |
Before suggesting AI patterns, place the request on these spectrums:
| Dimension | Spectrum |
|---|---|
| Interaction | Single prompt ←→ Multi-turn conversation |
| Response | Complete (generateText) ←→ Streaming (streamText) |
| Control | Simple prompt ←→ System prompt + tools |
| Provider | Budget (Gemini Flash) ←→ Premium (Claude/GPT-4) |
| Complexity | Text only ←→ Multi-modal (text + tools) |
| If Context Is... | Then Consider... |
|---|---|
| Chat interface | useChat hook + streamText |
| One-off generation | generateText (no streaming) |
| Need actions | Add tools parameter |
| Cost-sensitive | Gemini Flash or GPT-4o mini |
| High quality needed | Claude Sonnet or GPT-4o |
| Long context | Claude (200K) or Gemini (1M) |
Without AI SDK: You'd write different code for Claude, different code for GPT, handle streaming yourself, manage state yourself. Messy.
With AI SDK: One way to do it. Works with any AI. Handles streaming. Handles state.
```bash
# Core
pnpm add ai

# Pick your AI provider(s)
pnpm add @ai-sdk/anthropic   # Claude
pnpm add @ai-sdk/openai      # GPT
pnpm add @ai-sdk/google      # Gemini
```
```bash
# .env.local
ANTHROPIC_API_KEY=sk-ant-...
OPENAI_API_KEY=sk-...
GOOGLE_GENERATIVE_AI_API_KEY=...
```
```ts
// app/api/chat/route.ts
import { anthropic } from "@ai-sdk/anthropic"
import { streamText } from "ai"

export async function POST(req: Request) {
  const { messages } = await req.json()

  const result = streamText({
    model: anthropic("claude-sonnet-4-20250514"),
    messages,
  })

  return result.toDataStreamResponse()
}
```
```tsx
// components/chat.tsx
"use client"
import { useChat } from "ai/react"

export function Chat() {
  const { messages, input, handleInputChange, handleSubmit, isLoading } = useChat()

  return (
    <div className="flex flex-col h-screen">
      {/* Messages */}
      <div className="flex-1 overflow-y-auto p-4 space-y-4">
        {messages.map((m) => (
          <div key={m.id} className={m.role === "user" ? "text-right" : "text-left"}>
            <span
              className={`inline-block p-3 rounded-lg ${
                m.role === "user" ? "bg-blue-500 text-white" : "bg-gray-100"
              }`}
            >
              {m.content}
            </span>
          </div>
        ))}
      </div>

      {/* Input */}
      <form onSubmit={handleSubmit} className="p-4 border-t">
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Type a message..."
          className="w-full p-3 border rounded-lg"
          disabled={isLoading}
        />
      </form>
    </div>
  )
}
```
That's it. You now have a streaming AI chat.
To switch providers, change one import and one model id in the route handler:

```ts
import { openai } from "@ai-sdk/openai"  // was: import { anthropic } from "@ai-sdk/anthropic"
// ...and in streamText: model: openai("gpt-4o")
```