AI SDK Integration

Leveraging the Vercel AI SDK with Daydreams.

What is the Vercel AI SDK?

The Vercel AI SDK provides a unified way to connect to different AI providers like OpenAI, Anthropic, Google, and many others. Instead of learning each provider's unique API, you use one consistent interface.
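
For example, generating text looks the same no matter which provider supplies the model. Here's a minimal sketch using the AI SDK directly (outside of Daydreams); swapping in another provider's model function is the only change you'd make:

unified-interface.ts
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

// The same generateText call works with any AI SDK provider's model
const { text } = await generateText({
  model: openai("gpt-4o-mini"), // Or anthropic("..."), google("..."), etc.
  prompt: "Say hello in one sentence.",
});

console.log(text);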

Why This Matters for Your Agent

Daydreams is built on top of the Vercel AI SDK, which means you get:

Easy Provider Switching

easy-switching.ts
// Switch from OpenAI to Anthropic by changing one line
// model: openai("gpt-4o")           // OpenAI
model: anthropic("claude-3-sonnet"); // Anthropic
// Everything else stays the same!

Access to All Major Providers

  • OpenAI - GPT-4, GPT-4o, GPT-3.5
  • Anthropic - Claude 3 Opus, Sonnet, Haiku
  • Google - Gemini Pro, Gemini Flash
  • Groq - Ultra-fast Llama, Mixtral models
  • OpenRouter - Access to 100+ models through one API
  • And many more - See the full list

The Problem: Each AI Provider is Different

Without a unified SDK, you'd need different code for each provider:

without-ai-sdk.ts
// ❌ Without AI SDK - different APIs for each provider
import OpenAI from "openai";
import Anthropic from "@anthropic-ai/sdk";

if (provider === 'openai') {
  const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
  const response = await openai.chat.completions.create({
    model: "gpt-4",
    messages: [...],
  });
} else if (provider === 'anthropic') {
  const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });
  const response = await anthropic.messages.create({
    model: "claude-3-sonnet-20240229",
    max_tokens: 1024, // Anthropic requires max_tokens; OpenAI does not
    messages: [...],
  });
}
// Different response formats, error handling, etc.

The Solution: One Interface for All Providers

With the AI SDK, all providers work the same way:

with-ai-sdk.ts
// ✅ With AI SDK - same code for any provider
const agent = createDreams({
  model: openai("gpt-4o"), // Or anthropic("claude-3-sonnet")
  // model: groq("llama3-70b"),   // Or any other provider
  // Everything else stays identical!
});

Setting Up Your First Provider

1. Choose Your Provider

For this example, we'll use OpenAI, but the process is similar for all providers.

2. Install the Provider Package

install-openai.sh
npm install @ai-sdk/openai

3. Get Your API Key

  1. Go to OpenAI's API platform
  2. Create a new API key
  3. Add it to your environment:
.env
OPENAI_API_KEY=your_api_key_here

4. Use in Your Agent

openai-agent.ts
import { createDreams } from "@daydreamsai/core";
import { openai } from "@ai-sdk/openai";
 
const agent = createDreams({
  model: openai("gpt-4o-mini"), // Fast and cost-effective
  // model: openai("gpt-4o"),    // More capable but slower/pricier
 
  // Your agent configuration
  extensions: [...],
  contexts: [...],
});
 
await agent.start();

All Supported Providers

OpenAI

openai-setup.ts
// Install: npm install @ai-sdk/openai
import { openai } from "@ai-sdk/openai";
 
model: openai("gpt-4o-mini"); // Fast, cheap
model: openai("gpt-4o"); // Most capable
model: openai("gpt-3.5-turbo"); // Legacy but cheap

Get API key: platform.openai.com

Anthropic (Claude)

anthropic-setup.ts
// Install: npm install @ai-sdk/anthropic
import { anthropic } from "@ai-sdk/anthropic";
 
model: anthropic("claude-3-haiku-20240307"); // Fast, cheap
model: anthropic("claude-3-sonnet-20240229"); // Balanced
model: anthropic("claude-3-opus-20240229"); // Most capable

Get API key: console.anthropic.com

Google (Gemini)

google-setup.ts
// Install: npm install @ai-sdk/google
import { google } from "@ai-sdk/google";
 
model: google("gemini-1.5-flash"); // Fast, cheap
model: google("gemini-1.5-pro"); // More capable

Get API key: aistudio.google.com

Groq (Ultra-Fast)

groq-setup.ts
// Install: npm install @ai-sdk/groq
import { createGroq } from "@ai-sdk/groq";
const groq = createGroq();
 
model: groq("llama3-70b-8192"); // Fast Llama
model: groq("mixtral-8x7b-32768"); // Fast Mixtral

Get API key: console.groq.com

OpenRouter (100+ Models)

openrouter-setup.ts
// Install: npm install @openrouter/ai-sdk-provider
import { openrouter } from "@openrouter/ai-sdk-provider";
 
model: openrouter("anthropic/claude-3-opus");
model: openrouter("google/gemini-pro");
model: openrouter("meta-llama/llama-3-70b");
// And 100+ more models!

Get API key: openrouter.ai

Switching Providers

The beauty of the AI SDK integration is how easy it is to switch:

provider-switching.ts
// Development: Use fast, cheap models
const devAgent = createDreams({
  model: groq("llama3-8b-8192"), // Ultra-fast for testing
  // ... rest of config
});
 
// Production: Use more capable models
const prodAgent = createDreams({
  model: openai("gpt-4o"), // High quality for users
  // ... exact same config
});
 
// Experimenting: Try different providers
const experimentAgent = createDreams({
  model: anthropic("claude-3-opus"), // Different reasoning style
  // ... exact same config
});

Environment Variables

Set up your API keys in your .env file:

.env
# OpenAI
OPENAI_API_KEY=sk-...
 
# Anthropic
ANTHROPIC_API_KEY=sk-ant-...
 
# Google
GOOGLE_GENERATIVE_AI_API_KEY=AI...
 
# Groq
GROQ_API_KEY=gsk_...
 
# OpenRouter
OPENROUTER_API_KEY=sk-or-...

The AI SDK automatically picks up the right environment variable for each provider.
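
If you run your agent as a plain Node script, you may need to load the .env file yourself (for example with the dotenv package); many frameworks load it automatically. A minimal sketch, assuming dotenv is installed:

load-env.ts
// Load .env before the providers read their API keys from process.env
import "dotenv/config";

import { createDreams } from "@daydreamsai/core";
import { openai } from "@ai-sdk/openai";

const agent = createDreams({
  model: openai("gpt-4o-mini"), // Picks up OPENAI_API_KEY automatically
  // ... rest of config
});

await agent.start();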

Model Recommendations

For Development/Testing

  • Groq Llama3-8B - Ultra-fast responses for quick iteration
  • OpenAI GPT-4o-mini - Good balance of speed and capability

For Production

  • OpenAI GPT-4o - Best overall capability and reliability
  • Anthropic Claude-3-Sonnet - Great reasoning, good for complex tasks

For Cost Optimization

  • OpenAI GPT-3.5-turbo - Cheapest OpenAI option
  • Anthropic Claude-3-Haiku - Cheapest Anthropic option
  • Google Gemini Flash - Very affordable with good performance

For Special Use Cases

  • OpenRouter - Access models not available elsewhere
  • Local models - Use Ollama for privacy
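
If you want one agent definition that follows these recommendations, a common pattern is to choose the model from the environment. A sketch, assuming a NODE_ENV-based switch; adapt the models to your needs:

model-by-environment.ts
import { createDreams } from "@daydreamsai/core";
import { openai } from "@ai-sdk/openai";
import { createGroq } from "@ai-sdk/groq";

const groq = createGroq();

// Cheap and fast while developing, more capable in production
const model =
  process.env.NODE_ENV === "production"
    ? openai("gpt-4o")
    : groq("llama3-8b-8192");

const agent = createDreams({
  model,
  // ... rest of config
});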

Advanced Configuration

You can also configure providers with custom settings:

advanced-config.ts
import { createOpenAI } from "@ai-sdk/openai";
import { createAnthropic } from "@ai-sdk/anthropic";
 
// Custom OpenAI configuration
const customOpenAI = createOpenAI({
  apiKey: process.env.CUSTOM_OPENAI_KEY,
  baseURL: "https://your-proxy.com/v1", // Use a proxy
});
 
// Custom Anthropic configuration
const customAnthropic = createAnthropic({
  apiKey: process.env.CUSTOM_ANTHROPIC_KEY,
  baseURL: "https://your-endpoint.com", // Custom endpoint
});
 
const agent = createDreams({
  model: customOpenAI("gpt-4o"),
  // ... rest of config
});

Troubleshooting

Missing API Key

Error: Missing API key

Solution: Make sure your environment variable is set and the process can access it.

Model Not Found

Error: Model 'gpt-5' not found

Solution: Check the AI SDK docs for available model names.

Rate Limits

Error: Rate limit exceeded

Solution: Switch to a provider with higher limits or implement retry logic.
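
If you roll your own retry logic, a small exponential-backoff wrapper is usually enough. A minimal sketch; the withRetry helper below is illustrative and not part of Daydreams or the AI SDK:

retry-sketch.ts
// Illustrative helper: retry an async call with exponential backoff
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 1000
): Promise<T> {
  for (let attempt = 1; attempt <= attempts; attempt++) {
    try {
      return await fn();
    } catch (error) {
      if (attempt === attempts) throw error;
      // Wait 1s, 2s, 4s, ... before the next attempt
      await new Promise((resolve) =>
        setTimeout(resolve, baseDelayMs * 2 ** (attempt - 1))
      );
    }
  }
  throw new Error("unreachable");
}

// Usage (hypothetical call site):
// const result = await withRetry(() => someRateLimitedCall());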

Key Takeaways

  • One interface, many providers - Same code works with OpenAI, Anthropic, Google, etc.
  • Easy switching - Change providers by changing one line of code
  • Automatic key handling - Environment variables work automatically
  • Cost flexibility - Use cheap models for development, premium for production
  • Future-proof - New providers added to the AI SDK work immediately with Daydreams

The AI SDK integration gives you the freedom to choose the best model for your use case without vendor lock-in.