## Overview
LiveKit Inference includes support for the following OpenAI models. Pricing information for each model and provider is available on the pricing page.
| Model name | Model ID | Providers |
|---|---|---|
| GPT-4o | `openai/gpt-4o` | azure, openai |
| GPT-4o mini | `openai/gpt-4o-mini` | azure, openai |
| GPT-4.1 | `openai/gpt-4.1` | azure, openai |
| GPT-4.1 mini | `openai/gpt-4.1-mini` | azure, openai |
| GPT-4.1 nano | `openai/gpt-4.1-nano` | azure, openai |
| GPT-5 | `openai/gpt-5` | azure, openai |
| GPT-5 mini | `openai/gpt-5-mini` | azure, openai |
| GPT-5 nano | `openai/gpt-5-nano` | azure, openai |
| GPT OSS 120B | `openai/gpt-oss-120b` | baseten, groq (Cerebras coming soon) |
## Usage
To use an OpenAI model, pass its model ID to the `llm` argument of your `AgentSession`. LiveKit Inference manages the connection to the model automatically and picks the best available provider.
```python
from livekit.agents import AgentSession

session = AgentSession(
    llm="openai/gpt-4.1-mini",
    # ... tts, stt, vad, turn_detection, etc.
)
```
```typescript
import { AgentSession } from '@livekit/agents';

const session = new AgentSession({
  llm: 'openai/gpt-4.1-mini',
  // ... tts, stt, vad, turn_detection, etc.
});
```
## Parameters
To customize additional parameters, or to pin a specific provider, use the `LLM` class from the `inference` module.
```python
from livekit.agents import AgentSession, inference

session = AgentSession(
    llm=inference.LLM(
        model="openai/gpt-5-mini",
        provider="openai",
        extra_kwargs={"reasoning_effort": "low"},
    ),
    # ... tts, stt, vad, turn_detection, etc.
)
```
```typescript
import { AgentSession, inference } from '@livekit/agents';

const session = new AgentSession({
  llm: new inference.LLM({
    model: 'openai/gpt-5-mini',
    provider: 'openai',
    modelOptions: { reasoning_effort: 'low' },
  }),
  // ... tts, stt, vad, turn_detection, etc.
});
```
- `model`: The model to use for the LLM. Must be an OpenAI model ID from the table above.
- `provider`: The provider to use for the LLM. Must be `openai` to use OpenAI models with the additional parameters below.
- `extra_kwargs`: Additional parameters to pass to the provider's Chat Completions API, such as `reasoning_effort` or `max_completion_tokens`. In Node.js, this parameter is called `modelOptions`.
## Additional resources
The following links provide more information about OpenAI in LiveKit Inference.
- **OpenAI Plugin**: Plugin to use your own OpenAI account instead of LiveKit Inference.
- **Azure OpenAI Plugin**: Plugin to use your own Azure OpenAI account instead of LiveKit Inference.
- **OpenAI docs**: Official OpenAI platform documentation.
- **Azure OpenAI docs**: Azure OpenAI documentation, for OpenAI proprietary models.
- **Baseten docs**: Baseten's official Model API documentation, for GPT-OSS models.
- **Groq docs**: Groq's official API documentation, for GPT-OSS models.
- **OpenAI ecosystem overview**: Overview of the entire OpenAI ecosystem and LiveKit Agents integration.