Getting Started

Get up and running in 2 minutes.

Prerequisites: Node 18+, an LLM API key (OpenAI, Anthropic, etc.)

Starting fresh?

Use the CLI to scaffold a complete project:

```bash
npx create-ai-copilot
```


1. Install Dependencies

For Anthropic:

```bash
pnpm add @yourgpt/copilot-sdk @yourgpt/llm-sdk @anthropic-ai/sdk zod
```

For OpenAI (or any OpenAI-compatible API):

```bash
pnpm add @yourgpt/copilot-sdk @yourgpt/llm-sdk openai zod
```

The `openai` package works with OpenAI, Google Gemini, and xAI (all OpenAI-compatible APIs).


2. Create API Route

Anthropic:

app/api/chat/route.ts
```ts
import { streamText } from '@yourgpt/llm-sdk';
import { anthropic } from '@yourgpt/llm-sdk/anthropic';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: anthropic('claude-sonnet-4-20250514'),
    system: 'You are a helpful assistant.',
    messages,
  });

  return result.toTextStreamResponse();
}
```

.env.local
```bash
ANTHROPIC_API_KEY=sk-ant-...
```

OpenAI:

app/api/chat/route.ts
```ts
import { streamText } from '@yourgpt/llm-sdk';
import { openai } from '@yourgpt/llm-sdk/openai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: openai('gpt-4o'),
    system: 'You are a helpful assistant.',
    messages,
  });

  return result.toTextStreamResponse();
}
```

.env.local
```bash
OPENAI_API_KEY=sk-...
```

Google:

app/api/chat/route.ts
```ts
import { streamText } from '@yourgpt/llm-sdk';
import { google } from '@yourgpt/llm-sdk/google';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: google('gemini-2.0-flash'),
    system: 'You are a helpful assistant.',
    messages,
  });

  return result.toTextStreamResponse();
}
```

.env.local
```bash
GOOGLE_API_KEY=...
```

See Providers for more providers and configuration options.


3. Add Provider

app/providers.tsx
'use client';

import { CopilotProvider } from '@yourgpt/copilot-sdk/react';

export function Providers({ children }: { children: React.ReactNode }) {
  return (
    <CopilotProvider runtimeUrl="/api/chat">
      {children}
    </CopilotProvider>
  );
}

Wrap your app with the provider:

app/layout.tsx
```tsx
import { Providers } from './providers';

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html>
      <body>
        <Providers>{children}</Providers>
      </body>
    </html>
  );
}
```

4. Add Chat Component

app/page.tsx
```tsx
import { CopilotChat } from '@yourgpt/copilot-sdk/ui';

export default function Home() {
  return (
    <div className="h-screen p-4">
      <CopilotChat className="h-full rounded-xl border" />
    </div>
  );
}
```
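`CopilotChat` handles streaming for you, but if you ever need to consume the route's plain-text stream manually, here is a minimal sketch using only standard `fetch`/stream APIs (no SDK imports). The route path `/api/chat` matches the example above; everything else is generic web-platform code.

```typescript
// Read a streaming text Response chunk by chunk.
// Works with any plain-text streaming Response, such as the one
// returned by toTextStreamResponse() in the route above.
export async function readTextStream(
  res: Response,
  onChunk: (text: string) => void = () => {},
): Promise<string> {
  if (!res.body) throw new Error('Response has no body');
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let full = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    const chunk = decoder.decode(value, { stream: true });
    full += chunk;
    onChunk(chunk); // e.g. append to UI state as tokens arrive
  }
  return full;
}
```

A typical call site would be `readTextStream(await fetch('/api/chat', { method: 'POST', body: JSON.stringify({ messages }) }), appendToUi)`.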

5. Configure Tailwind

The SDK uses Tailwind CSS utility classes. Add the SDK source path based on your CSS file location:

| CSS file location | `@source` path |
| --- | --- |
| app/globals.css | `@source "../node_modules/@yourgpt/copilot-sdk/dist/**/*.{js,ts,jsx,tsx}";` |
| src/app/globals.css | `@source "../../node_modules/@yourgpt/copilot-sdk/dist/**/*.{js,ts,jsx,tsx}";` |
| globals.css (root) | `@source "node_modules/@yourgpt/copilot-sdk/dist/**/*.{js,ts,jsx,tsx}";` |

app/globals.css
```css
@import "tailwindcss";

/* Include SDK package for Tailwind class detection */
@source "../node_modules/@yourgpt/copilot-sdk/dist/**/*.{js,ts,jsx,tsx}";

@custom-variant dark (&:is(.dark *));

/* Your existing shadcn/ui @theme and variables... */
```

The SDK works automatically with your existing shadcn/ui CSS variables.


Done!

Run `pnpm dev` and you have a working AI chat.

