OpenAI LLM integration guide

How to use the OpenAI LLM plugin for LiveKit Agents.

Overview

OpenAI provides powerful language models like gpt-4o and o1. With LiveKit's OpenAI integration and the Agents framework, you can build sophisticated voice AI applications using their industry-leading models.

Tip

Using Azure OpenAI? See our Azure OpenAI LLM guide.

Quick reference

This section includes a basic usage example and some reference material. For links to more detailed documentation, see Additional resources.

Installation

Install the plugin from PyPI:

pip install "livekit-agents[openai]~=1.0rc"

Authentication

The OpenAI plugin requires an OpenAI API key.

Set OPENAI_API_KEY in your .env file.
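
For example, your .env file might contain a line like the following (the value shown is a placeholder for your actual key):

OPENAI_API_KEY=<your-openai-api-key>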

Usage

Use OpenAI within an AgentSession or as a standalone LLM service. For example, you can use this LLM in the Voice AI quickstart.

from livekit.agents import AgentSession
from livekit.plugins import openai

session = AgentSession(
    llm=openai.LLM(
        model="gpt-4o-mini"
    ),
    # ... tts, stt, vad, turn_detection, etc.
)

Parameters

This section describes some of the available parameters. See the plugin reference for a complete list of all available parameters.

model (string, optional) - Default: gpt-4o-mini

The model to use for the LLM. For more information, see the OpenAI documentation.

temperature (float, optional) - Default: 0.8

A measure of randomness in output. A lower value results in more predictable output, while a higher value results in more creative output.

tool_choice (ToolChoice | Literal['auto', 'required', 'none'], optional) - Default: auto

Controls whether the model may use tools during response generation: 'auto' lets the model decide, 'required' forces a tool call, and 'none' disables tool use.
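
As a sketch of how these parameters fit together, the following instantiation passes all three to openai.LLM; the values are illustrative, not recommendations:

from livekit.plugins import openai

llm = openai.LLM(
    model="gpt-4o-mini",   # model name passed through to the OpenAI API
    temperature=0.5,       # lower than the 0.8 default for more predictable output
    tool_choice="auto",    # let the model decide when to call tools
)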

Additional resources

The following resources provide more information about using OpenAI with LiveKit Agents.