OpenAI API compatible LLMs

A guide on how to use the OpenAI plugin functions for LLMs.

Customize the LLM for your voice agent

The OpenAI plugin provides methods that allow you to use OpenAI API compatible LLMs. In most cases, you can call the method with no parameter values by setting the required environment variables and accepting default values. The minimal syntax for each method assumes the environment variables for required values (for example, API keys) are set.
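
For example, assuming the relevant API key environment variable is set, you can construct an LLM with no arguments, or pass values explicitly to override the defaults (the model name and temperature shown below are placeholders, not required values):

# Minimal syntax: the Groq API key is read from GROQ_API_KEY
llm = openai.LLM.with_groq()

# Explicit syntax: override defaults directly
llm = openai.LLM.with_groq(
    model="llama-3.3-70b-versatile",
    api_key="<your_groq_api_key>",
    temperature=0.7,
)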

Support for additional LLMs is available through other LiveKit plugins.

Example voice agent

Use the VoicePipelineAgent class and the OpenAI plugin to specify the LLM. The following example uses Groq:

  1. Set the GROQ_API_KEY environment variable:

    export GROQ_API_KEY=<your_groq_api_key>
  2. Create an agent:

    agent = VoicePipelineAgent(
        vad=ctx.proc.userdata["vad"],
        stt=deepgram.STT(),
        llm=openai.LLM.with_groq(),
        tts=cartesia.TTS(),
        chat_ctx=initial_ctx,
    )
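
This snippet assumes it runs inside an agent entrypoint where the VAD model was preloaded and a chat context was created. The following is a minimal sketch of that surrounding worker, assuming the standard livekit-agents structure with Silero VAD; the prewarm pattern and system prompt text are illustrative assumptions, not part of this guide:

from livekit.agents import JobContext, WorkerOptions, cli, llm
from livekit.agents.pipeline import VoicePipelineAgent
from livekit.plugins import cartesia, deepgram, openai, silero

def prewarm(proc):
    # Load the VAD model once per process so each job can reuse it
    proc.userdata["vad"] = silero.VAD.load()

async def entrypoint(ctx: JobContext):
    # Seed the conversation with a system prompt (illustrative text)
    initial_ctx = llm.ChatContext().append(
        role="system",
        text="You are a helpful voice assistant.",
    )

    await ctx.connect()
    participant = await ctx.wait_for_participant()

    agent = VoicePipelineAgent(
        vad=ctx.proc.userdata["vad"],
        stt=deepgram.STT(),
        llm=openai.LLM.with_groq(),
        tts=cartesia.TTS(),
        chat_ctx=initial_ctx,
    )
    agent.start(ctx.room, participant)

if __name__ == "__main__":
    cli.run_app(WorkerOptions(entrypoint_fnc=entrypoint, prewarm_fnc=prewarm))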

Supported LLMs

The OpenAI plugin offers support for the following LLMs:

- Azure
- Cerebras
- DeepSeek
- Fireworks
- Groq
- Octo
- Ollama
- Perplexity
- Telnyx
- Together
- xAI
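
Each supported LLM has a corresponding with_* method on openai.LLM, so switching providers is typically a one-line change in the agent definition. A brief sketch, assuming the provider's API key is set in the environment and that the Ollama base URL shown is the usual local default:

# Cerebras: the API key is read from the environment
llm=openai.LLM.with_cerebras()

# Ollama running locally; the base_url value is an assumed local default
llm=openai.LLM.with_ollama(base_url="http://localhost:11434/v1")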

The method name, syntax, and parameters for each supported LLM are described below. The Azure method is shown here as an example:

Method name

with_azure

Syntax

The minimal syntax for this method assumes the environment variables for required values (for example, API keys) are set.

agent = VoicePipelineAgent(
    vad=ctx.proc.userdata["vad"],
    stt=deepgram.STT(),
    llm=openai.LLM.with_azure(),
    tts=cartesia.TTS(),
    chat_ctx=initial_ctx,
)

Parameters

The with_azure method accepts the following parameters:

| Parameter | Data type | Default value / Environment variable |
| --- | --- | --- |
| model | String | gpt-4o |
| azure_endpoint | String | AZURE_OPENAI_ENDPOINT |
| azure_deployment | String | |
| api_version | String | OPENAI_API_VERSION |
| api_key | String | AZURE_OPENAI_API_KEY |
| azure_ad_token | String | AZURE_OPENAI_AD_TOKEN |
| azure_ad_token_provider | AsyncAzureADTokenProvider | |
| organization | String | OPENAI_ORG_ID |
| project | String | OPENAI_PROJECT_ID |
| base_url | String | |
| user | String | |
| temperature | Float | |
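
For example, to configure Azure explicitly rather than through environment variables (the endpoint, deployment, and API version values below are placeholders):

import os

llm = openai.LLM.with_azure(
    model="gpt-4o",
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    azure_deployment="<your-deployment>",
    api_version="2024-06-01",
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    temperature=0.7,
)

Any parameter you omit falls back to the default value or environment variable listed in the table above.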