Providers Overview

Connect to any LLM provider

Copilot SDK supports multiple LLM providers through @yourgpt/llm-sdk. Switch providers without changing your frontend code.

All providers use the same API. Change one line in your backend to switch from OpenAI to Anthropic.


How It Works

Install the SDK and the provider's official SDK. Each provider returns a model instance that works with generateText() and streamText().

Installation

```bash
npm install @yourgpt/copilot-sdk @yourgpt/llm-sdk openai
```

OpenAI, Google Gemini, and xAI all use the `openai` SDK (OpenAI-compatible APIs). For Anthropic, install its official SDK instead:

```bash
npm install @yourgpt/copilot-sdk @yourgpt/llm-sdk @anthropic-ai/sdk
```

Backend Setup

app/api/chat/route.ts

```ts
import { streamText } from '@yourgpt/llm-sdk';
import { openai } from '@yourgpt/llm-sdk/openai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: openai('gpt-4o-mini'),
    system: 'You are a helpful assistant.',
    messages,
  });

  return result.toTextStreamResponse();
}
```
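The route streams plain text back to the browser, so you can consume it with nothing more than the standard `fetch` and Streams APIs. The helper below is a sketch of that client side; the endpoint path and request body shape mirror the route above, and everything else is standard Web platform code.

```typescript
// Read a streamed text response chunk-by-chunk into a single string.
// Works with any Response whose body is a text stream, such as the
// one returned by result.toTextStreamResponse() in the route above.
async function readTextStream(response: Response): Promise<string> {
  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  let text = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // stream: true keeps multi-byte characters split across chunks intact
    text += decoder.decode(value, { stream: true });
  }
  return text + decoder.decode(); // flush any buffered bytes
}

// Usage sketch (assumes the route above is served at /api/chat):
// const res = await fetch('/api/chat', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify({ messages: [{ role: 'user', content: 'Hi' }] }),
// });
// const reply = await readTextStream(res);
```

In practice the Copilot SDK's React layer handles this for you; the helper only illustrates what travels over the wire.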

Frontend Setup

app/providers.tsx

```tsx
'use client';

import { CopilotProvider } from '@yourgpt/copilot-sdk/react';

export function Providers({ children }: { children: React.ReactNode }) {
  return (
    <CopilotProvider runtimeUrl="/api/chat">
      {children}
    </CopilotProvider>
  );
}
```

Switching Providers

Change the import and the model; the frontend stays the same:

app/api/chat/route.ts

```ts
import { streamText } from '@yourgpt/llm-sdk';

// Pick one provider:

// OpenAI
import { openai } from '@yourgpt/llm-sdk/openai';
const model = openai('gpt-4o');

// Anthropic
import { anthropic } from '@yourgpt/llm-sdk/anthropic';
const model = anthropic('claude-3-5-sonnet-20241022');

// Google
import { google } from '@yourgpt/llm-sdk/google';
const model = google('gemini-2.0-flash');

// xAI (Grok)
import { xai } from '@yourgpt/llm-sdk/xai';
const model = xai('grok-3-fast-beta');

// Use whichever model you chose
const result = await streamText({
  model,
  messages,
});
```

Your tools, UI, and all frontend code remain unchanged. The SDK normalizes responses across providers.
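A common pattern is to select the provider from an environment variable so deployments can switch models without a code change. The sketch below only covers the lookup half of that pattern; the model ids come from the examples above, while the `LLM_PROVIDER` variable name and the idea of a default-model table are assumptions, not part of the SDK.

```typescript
// Default model id per provider (ids taken from the examples above).
const DEFAULT_MODELS: Record<string, string> = {
  openai: 'gpt-4o',
  anthropic: 'claude-3-5-sonnet-20241022',
  google: 'gemini-2.0-flash',
  xai: 'grok-3-fast-beta',
};

// Resolve a model id from a provider name; throws on unknown providers
// so a misconfigured deployment fails at startup, not per-request.
function resolveModelId(provider: string): string {
  const id = DEFAULT_MODELS[provider.toLowerCase()];
  if (!id) throw new Error(`Unknown LLM provider: ${provider}`);
  return id;
}

// In the route, pair the resolved id with the matching factory for the
// same provider name, e.g. openai(...), anthropic(...), google(...):
// const provider = process.env.LLM_PROVIDER ?? 'openai';
// const model = openai(resolveModelId(provider));
```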


Available Providers

| Provider | Import | SDK Required | Example |
|---|---|---|---|
| OpenAI | `@yourgpt/llm-sdk/openai` | `openai` | `openai('gpt-4o')` |
| Anthropic | `@yourgpt/llm-sdk/anthropic` | `@anthropic-ai/sdk` | `anthropic('claude-3-5-sonnet-20241022')` |
| Google | `@yourgpt/llm-sdk/google` | `openai` | `google('gemini-2.0-flash')` |
| xAI | `@yourgpt/llm-sdk/xai` | `openai` | `xai('grok-3-fast-beta')` |

Why Only 2 SDKs?

Most LLM providers now offer OpenAI-compatible APIs. This means you only need:

| SDK | Providers |
|---|---|
| `openai` | OpenAI, Google Gemini, xAI Grok, Azure OpenAI, Groq, Together AI, Ollama |
| `@anthropic-ai/sdk` | Anthropic Claude (native SDK for full features) |

Google and xAI use OpenAI-compatible endpoints. We automatically configure the correct baseURL for each provider.
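For reference, this is roughly what that configuration looks like if you use the official `openai` package directly. The base URLs below are assumptions drawn from each provider's published OpenAI-compatible endpoints; `@yourgpt/llm-sdk` sets them for you, so you never need this by hand.

```typescript
// Assumed OpenAI-compatible base URLs (the SDK configures these internally).
const OPENAI_COMPATIBLE_BASE_URLS: Record<string, string> = {
  google: 'https://generativelanguage.googleapis.com/v1beta/openai/',
  xai: 'https://api.x.ai/v1',
};

// With the official `openai` package you would pass the baseURL yourself:
// import OpenAI from 'openai';
// const client = new OpenAI({
//   baseURL: OPENAI_COMPATIBLE_BASE_URLS.google,
//   apiKey: process.env.GEMINI_API_KEY, // assumed env var name
// });
```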


Provider Comparison

| Provider | Speed | Quality | Cost | Best For |
|---|---|---|---|---|
| OpenAI | Fast | Excellent | $$ | General use |
| Anthropic | Medium | Excellent | $$ | Long context, safety |
| Google | Fast | Very good | $ | Multimodal |
| xAI | Ultra fast | Excellent | $ | Speed-critical apps |
