
vercel-ai-sdk

Use when integrating AI/LLM features with the Vercel AI SDK — streaming responses, tool calling, multi-provider support, or building chat interfaces. Also use when building AI-powered features in Next.js or React applications.

| Model | Source |
|-------|--------|
| sonnet | pack: ai |

Full Reference


SDK version 6.x (2026). The core package is `ai`, with provider adapters (`@ai-sdk/anthropic`, `@ai-sdk/openai`, `@ai-sdk/google`, etc.) and React hooks via `@ai-sdk/react`. Providers read their API keys from environment variables automatically, so no explicit initialization config is needed for basic use.
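As a concrete illustration of the "no explicit config" point, here is a minimal streaming route-handler sketch. It assumes the v6 API surface described on this card (`streamText`, `convertToModelMessages`, `toUIMessageStreamResponse`) and a hypothetical model id; the provider picks up `ANTHROPIC_API_KEY` from the environment on its own.

```typescript
// app/api/chat/route.ts — minimal streaming chat endpoint sketch.
// No provider client is constructed explicitly: the anthropic()
// helper reads ANTHROPIC_API_KEY from the environment.
import { anthropic } from '@ai-sdk/anthropic';
import { streamText, convertToModelMessages, type UIMessage } from 'ai';

export async function POST(req: Request) {
  const { messages }: { messages: UIMessage[] } = await req.json();

  const result = streamText({
    model: anthropic('claude-sonnet-4-5'), // hypothetical model id
    messages: convertToModelMessages(messages),
  });

  // Stream the response back in the UI-message protocol that
  // useChat on the client expects.
  return result.toUIMessageStreamResponse();
}
```

See reference/setup.md and reference/streaming.md for the authoritative versions.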

| Fact | Value |
|------|-------|
| Install | `npm install ai @ai-sdk/anthropic @ai-sdk/openai @ai-sdk/google @ai-sdk/react` |
| Docs | https://ai-sdk.dev |
| Anthropic env var | `ANTHROPIC_API_KEY` |
| OpenAI env var | `OPENAI_API_KEY` |
| Google env var | `GOOGLE_GENERATIVE_AI_API_KEY` |
| Deprecated | `generateObject`, `streamObject`; use the `Output.*` API instead |
| v6 breaking change | `useChat` requires `DefaultChatTransport`; `sendMessage({ text })` replaces `handleSubmit`; `isLoading` removed |
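The v6 breaking changes listed above can be sketched as a client component. This is an assumption-laden sketch, not the canonical migration: it assumes the transport is passed as `transport`, that loading state is read from a `status` field, and that messages expose typed `parts`.

```typescript
// Chat.tsx — sketch of the v6 useChat pattern named above:
// DefaultChatTransport is required, sendMessage({ text }) replaces
// handleSubmit, and isLoading is gone (status is assumed here).
'use client';
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport } from 'ai';
import { useState } from 'react';

export function Chat() {
  const [input, setInput] = useState('');
  const { messages, sendMessage, status } = useChat({
    transport: new DefaultChatTransport({ api: '/api/chat' }),
  });

  return (
    <form
      onSubmit={(e) => {
        e.preventDefault();
        sendMessage({ text: input }); // replaces handleSubmit
        setInput('');
      }}
    >
      {messages.map((m) => (
        <div key={m.id}>
          {m.parts.map((part, i) =>
            part.type === 'text' ? <span key={i}>{part.text}</span> : null
          )}
        </div>
      ))}
      <input value={input} onChange={(e) => setInput(e.target.value)} />
      <button disabled={status !== 'ready'}>Send</button>
    </form>
  );
}
```

reference/chat-ui.md is the self-contained source for the full pattern.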
| I want to… | File |
|------------|------|
| Install, set env vars, and set up a route handler | reference/setup.md |
| Stream text or handle full stream events | reference/streaming.md |
| Call tools, run multi-step agents, or render tool UI | reference/tool-use.md |
| Generate structured output with `Output.object()`, arrays, or classification | reference/structured-output.md |
| Build a chat UI with `useChat` or `useCompletion` | reference/chat-ui.md |
| Configure multiple providers, custom aliases, or edge runtime | reference/provider-config.md |
| Handle errors, catch by type, or retry with backoff | reference/error-handling.md |

Usage: Read the reference file matching your current task from the index above. Each file is self-contained with code examples and inline gotchas.