OpenAI-compatible LLMs

Connect to any OpenAI-compatible LLM provider in your voice agents.

Overview

Many LLM providers expose an API that's compatible with OpenAI's format, which means LiveKit's OpenAI plugin can connect to them without a separate integration. For popular providers, the plugin includes convenience methods that pre-configure the endpoint for you. For any other compatible provider, you can pass a base_url directly.

These providers use the Chat Completions API format via openai.LLM(). If you're connecting directly to OpenAI's platform, use openai.responses.LLM() instead; see OpenAI API modes for details on the Responses API.
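As a minimal sketch of the distinction (the model name is illustrative, and both constructors are assumed to read OPENAI_API_KEY from the environment by default):

```python
from livekit.plugins import openai

# Chat Completions API format: works with OpenAI and any compatible provider.
chat_llm = openai.LLM(model="gpt-4o-mini")

# Responses API: for connecting directly to OpenAI's platform.
responses_llm = openai.responses.LLM(model="gpt-4o-mini")
```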

Installation

Install the OpenAI plugin to use any of these providers:

uv add "livekit-agents[openai]~=1.4"
pnpm add @livekit/agents-plugin-openai@1.x

Convenience methods

The OpenAI plugin includes wrapper methods for the following providers. Use these when available to handle endpoint configuration automatically:
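For example, the Python plugin includes a with_groq() wrapper for Groq; the model name below is illustrative, and this sketch assumes the wrapper reads GROQ_API_KEY from the environment when no key is passed:

```python
from livekit.plugins import openai

# Groq convenience method: pre-configures base_url for Groq's
# OpenAI-compatible endpoint, so only the model needs to be specified.
llm = openai.LLM.with_groq(model="llama-3.3-70b-versatile")
```

The resulting object is a standard openai.LLM instance, so it can be passed to AgentSession like any other LLM.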

Using a custom endpoint

For providers that don't have a convenience method but expose an OpenAI-compatible API, pass their endpoint and API key directly to the openai.LLM constructor:

from livekit.agents import AgentSession
from livekit.plugins import openai

session = AgentSession(
    llm=openai.LLM(
        model="your-model-name",
        base_url="https://your-provider.com/v1",
        api_key="your-api-key",
    ),
    # ... tts, stt, vad, turn_detection, etc.
)
import { voice } from '@livekit/agents';
import * as openai from '@livekit/agents-plugin-openai';

const session = new voice.AgentSession({
  llm: new openai.LLM({
    model: 'your-model-name',
    baseURL: 'https://your-provider.com/v1',
    apiKey: 'your-api-key',
  }),
  // ... tts, stt, vad, turn_detection, etc.
});