Introduction

Build AI Copilots for Your Product

Production-ready AI Copilots for any product. Connect any LLM, deploy on your infrastructure, own your data. Built for speed and control.

What Makes This Different?

Prebuilt Copilot UI

Production-ready chat components. Streaming, markdown, code highlighting, and file attachments included.

Tool Execution

Define tools with Zod schemas. AI calls them, you handle the result. Full agentic loop support.

Plug & Play

Works out of the box. No configuration headaches. Just React hooks.

Multi-LLM

OpenAI, Anthropic, Google, xAI, and more. Swap providers without changing your code.
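The "swap providers without changing your code" claim works because most providers expose OpenAI-compatible APIs. A hypothetical helper (not part of the SDK; names and models are illustrative) shows that switching often comes down to a base URL and model name:

```typescript
// Illustrative only: OpenAI-compatible providers differ mainly in endpoint and model.
type Provider = 'openai' | 'google' | 'xai';

function providerConfig(provider: Provider): { baseURL: string; model: string } {
  switch (provider) {
    case 'openai':
      return { baseURL: 'https://api.openai.com/v1', model: 'gpt-4o-mini' };
    case 'google':
      // Gemini's OpenAI-compatible endpoint
      return {
        baseURL: 'https://generativelanguage.googleapis.com/v1beta/openai',
        model: 'gemini-1.5-flash',
      };
    case 'xai':
      return { baseURL: 'https://api.x.ai/v1', model: 'grok-2' };
  }
}
```

The chat code itself never changes; only this configuration does.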


The Gist

import { CopilotProvider } from '@yourgpt/copilot-sdk/react';
import { CopilotChat } from '@yourgpt/copilot-sdk/ui';

function App() {
  return (
    <CopilotProvider runtimeUrl="/api/chat">
      <CopilotChat />
    </CopilotProvider>
  );
}

That's a working AI chat. Want the AI to see your screen when users say "I have an error"?

<CopilotProvider
  runtimeUrl="/api/chat"
  tools={{ screenshot: true, console: true, requireConsent: true }}
>
  <CopilotChat />
</CopilotProvider>

Done. The SDK handles the consent UI, captures the context, and sends it to the AI.
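Conceptually, the consent flow is "ask first, capture only on yes." A minimal sketch of that shape (the SDK's real API differs; `captureWithConsent` and both callbacks are illustrative stand-ins):

```typescript
// Illustrative only: nothing is captured unless the user explicitly agrees.
type CaptureResult = { granted: boolean; context?: string };

async function captureWithConsent(
  askUser: () => Promise<boolean>,  // stand-in for the SDK's consent dialog
  capture: () => Promise<string>,   // stand-in for screenshot/console capture
): Promise<CaptureResult> {
  if (!(await askUser())) {
    return { granted: false };      // user declined: no capture happens
  }
  return { granted: true, context: await capture() };
}
```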


Packages

Package               | What it does
@yourgpt/copilot-sdk  | React hooks, provider, UI components, and core utilities
@yourgpt/llm-sdk      | Multi-provider LLM integration + streaming

SDK Requirements

Provider             | SDK Required
OpenAI, Google, xAI  | openai
Anthropic            | @anthropic-ai/sdk

Most providers use OpenAI-compatible APIs, so you only need one of two SDKs. Learn more →


Quick Install

npm install @yourgpt/copilot-sdk @yourgpt/llm-sdk openai
# or
pnpm add @yourgpt/copilot-sdk @yourgpt/llm-sdk openai
# or
bun add @yourgpt/copilot-sdk @yourgpt/llm-sdk openai

The openai SDK works with OpenAI, Google Gemini, and xAI. For Anthropic, use @anthropic-ai/sdk instead. See all providers →


The Flow

1. User types a message
2. CopilotProvider sends it to your /api/chat endpoint
3. The runtime talks to OpenAI/Anthropic/etc.
4. The AI decides: respond OR call a tool
5. The tool executes client-side → the result is sent back
6. The AI continues until done (agentic loop)
7. The response streams to the UI

All of this is handled. You just define tools and build UI.
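The agentic part of the flow can be sketched in a few lines. This is a stripped-down illustration, not the SDK's runtime; `Model`, `ModelReply`, and `Tools` are hypothetical stand-ins:

```typescript
// Illustrative sketch of the agentic loop: keep calling the model,
// executing any tool it requests, until it answers with text.
type ModelReply =
  | { type: 'text'; text: string }
  | { type: 'tool_call'; name: string; args: unknown };

type Model = (history: string[]) => ModelReply;
type Tools = Record<string, (args: unknown) => string>;

function agenticLoop(model: Model, tools: Tools, userMessage: string): string {
  const history = [userMessage];
  for (;;) {
    const reply = model(history);
    if (reply.type === 'text') return reply.text; // AI is done: stream to UI
    const result = tools[reply.name](reply.args); // tool executes client-side
    history.push(result);                         // result goes back to the AI
  }
}
```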


Real Example: Navigation Tool

import { useToolWithSchema } from '@yourgpt/copilot-sdk/react';
import { useNavigate } from 'react-router-dom';
import { z } from 'zod';

function NavigationTool() {
  const navigate = useNavigate();

  useToolWithSchema({
    name: 'navigate_to_page',
    description: 'Navigate user to a specific page',
    schema: z.object({
      path: z.string().describe('The URL path to navigate to'),
    }),
    handler: async ({ path }) => {
      navigate(path);
      return { success: true, navigatedTo: path };
    },
  });

  return null;
}

Now when a user says "take me to settings", the AI calls navigate_to_page({ path: '/settings' }).
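Under the hood, a registered tool is just a named handler the runtime can look up and invoke with the AI's parsed arguments. A minimal sketch of that dispatch (names like `dispatchToolCall` and `registry` are illustrative, not SDK API):

```typescript
// Illustrative only: the AI emits a tool call by name; the registered
// handler runs, and its return value is sent back to the AI as the result.
type ToolHandler = (args: { path: string }) => Promise<object>;

const registry = new Map<string, ToolHandler>();

registry.set('navigate_to_page', async ({ path }) => {
  // in the real component this would call navigate(path)
  return { success: true, navigatedTo: path };
});

async function dispatchToolCall(name: string, args: { path: string }) {
  const handler = registry.get(name);
  if (!handler) throw new Error(`unknown tool: ${name}`);
  return handler(args); // this result flows back into the agentic loop
}
```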


What's Next?
