# Vercel AI SDK

Use BrainstormRouter with the Vercel AI SDK for streaming React UIs.
## Setup

```bash
npm install ai @ai-sdk/openai brainstormrouter
```
## Provider configuration

BrainstormRouter exposes an OpenAI-compatible endpoint, so point the OpenAI provider at its base URL:

```ts
import { createOpenAI } from "@ai-sdk/openai";

const brainstorm = createOpenAI({
  baseURL: "https://api.brainstormrouter.com/v1",
  apiKey: "br_live_...",
});
```
## Text generation

```ts
import { generateText } from "ai";

const { text } = await generateText({
  model: brainstorm("anthropic/claude-sonnet-4"),
  prompt: "Explain quantum computing in one paragraph.",
});
```
## Streaming

```ts
import { streamText } from "ai";

const result = streamText({
  model: brainstorm("anthropic/claude-sonnet-4"),
  prompt: "Write a poem about APIs.",
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```
## React Server Components

```ts
// app/api/chat/route.ts
import { streamText } from "ai";
import { createOpenAI } from "@ai-sdk/openai";

const brainstorm = createOpenAI({
  baseURL: "https://api.brainstormrouter.com/v1",
  apiKey: process.env.BRAINSTORMROUTER_API_KEY!,
});

export async function POST(req: Request) {
  const { messages } = await req.json();
  const result = streamText({
    model: brainstorm("anthropic/claude-sonnet-4"),
    messages,
  });
  return result.toDataStreamResponse();
}
```
## With memory

Pre-load BrainstormRouter memory once, then let requests draw on it via agentic mode:

```ts
import BrainstormRouter from "brainstormrouter";

// Bootstrap memory once (e.g. in a setup script)
const br = new BrainstormRouter();
await br.memory.init([
  { content: "Company: Acme Corp. Product: Widget Pro.", source: "about.md" },
]);

// Subsequent Vercel AI SDK calls use the enriched model
// (`brainstorm` and `streamText` as configured in the sections above)
const result = streamText({
  model: brainstorm("anthropic/claude-sonnet-4"),
  messages: [{ role: "user", content: "What product does the company sell?" }],
  // Pass the mode via headers or a body extension
});
```
## Tool calling

```ts
import { generateText, tool } from "ai";
import { z } from "zod";

const { text, toolResults } = await generateText({
  model: brainstorm("anthropic/claude-sonnet-4"),
  prompt: "What's the weather in Tokyo?",
  tools: {
    weather: tool({
      description: "Get weather for a city",
      parameters: z.object({ city: z.string() }),
      execute: async ({ city }) => ({ temp: "18°C", condition: "Cloudy" }),
    }),
  },
});
```

Note that `generateText` stops after the first tool call by default, so `text` may be empty; pass `maxSteps: 2` (or higher) to let the model incorporate the tool result into its answer.
## Model variants

Use routing variants directly in the model slug:

```ts
const cheap = brainstorm("openai/gpt-4o-mini:floor");
const fast = brainstorm("anthropic/claude-haiku-4-5:fast");
const best = brainstorm("anthropic/claude-sonnet-4:best");
```
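When the variant is decided at runtime, a small helper keeps the suffix logic in one place (a hypothetical helper, not part of any SDK; the variant names come from the examples above):

```ts
type Variant = "floor" | "fast" | "best";

// Appends a routing-variant suffix to a model slug.
// withVariant("openai/gpt-4o-mini", "floor") -> "openai/gpt-4o-mini:floor"
function withVariant(slug: string, variant?: Variant): string {
  return variant ? `${slug}:${variant}` : slug;
}
```

For example: `brainstorm(withVariant("anthropic/claude-sonnet-4", isProd ? "best" : "floor"))`.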