Customize the LLM for your voice agent
The OpenAI plugin provides methods for using any OpenAI API-compatible LLM. In most cases, you can call a method with no parameter values: set the required environment variables (for example, API keys) and accept the defaults.
Support for additional LLMs is available through other LiveKit plugins.
Example voice agent
Use the VoicePipelineAgent class and the OpenAI plugin to specify the LLM. This example uses Groq as the LLM:
Set the GROQ_API_KEY environment variable:

```shell
export GROQ_API_KEY=<your_groq_api_key>
```

Create an agent:

```python
agent = VoicePipelineAgent(
    vad=ctx.proc.userdata["vad"],
    stt=deepgram.STT(),
    llm=openai.LLM.with_groq(),
    tts=cartesia.TTS(),
    chat_ctx=initial_ctx,
)
```
Supported LLMs
The OpenAI plugin offers support for the following LLMs:
Azure | Cerebras | DeepSeek | Fireworks | Groq | Octo | Ollama | Perplexity | Telnyx | Together | xAI
Parameters and syntax for each method are listed below.
Method name
with_azure
Syntax
The minimal syntax for this method assumes the environment variables for required values (for example, API keys) are set.
```python
agent = VoicePipelineAgent(
    vad=ctx.proc.userdata["vad"],
    stt=deepgram.STT(),
    llm=openai.LLM.with_azure(),
    tts=cartesia.TTS(),
    chat_ctx=initial_ctx,
)
```
Parameters
The with_azure method accepts the following parameters:
Parameter | Data type | Default value / Environment variable |
---|---|---|
model | String | gpt-4o |
azure_endpoint | String | AZURE_OPENAI_ENDPOINT |
azure_deployment | String | |
api_version | String | OPENAI_API_VERSION |
api_key | String | AZURE_OPENAI_API_KEY |
azure_ad_token | String | AZURE_OPENAI_AD_TOKEN |
azure_ad_token_provider | AsyncAzureADTokenProvider | |
organization | String | OPENAI_ORG_ID |
project | String | OPENAI_PROJECT_ID |
base_url | String | |
user | String | |
temperature | Float | |
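When you prefer not to rely on environment variables, the values in the table can be passed explicitly instead. The sketch below mimics that resolution order in plain Python; the `azure_llm_kwargs` helper is illustrative only and not part of the plugin.

```python
import os

def azure_llm_kwargs(**overrides) -> dict:
    # Illustrative only: assemble the keyword arguments a call like
    # with_azure() would resolve, filling unset values from the
    # environment variables listed in the table above.
    defaults = {
        "model": "gpt-4o",
        "azure_endpoint": os.environ.get("AZURE_OPENAI_ENDPOINT"),
        "api_version": os.environ.get("OPENAI_API_VERSION"),
        "api_key": os.environ.get("AZURE_OPENAI_API_KEY"),
        "organization": os.environ.get("OPENAI_ORG_ID"),
        "project": os.environ.get("OPENAI_PROJECT_ID"),
    }
    # Explicit keyword arguments override environment-derived defaults.
    defaults.update(overrides)
    return defaults

os.environ["AZURE_OPENAI_ENDPOINT"] = "https://example.openai.azure.com"
kwargs = azure_llm_kwargs(api_key="explicit-key")
print(kwargs["model"])           # gpt-4o
print(kwargs["azure_endpoint"])  # https://example.openai.azure.com
print(kwargs["api_key"])         # explicit-key
```

The same precedence applies to every method in the table: an explicit parameter always wins over its environment variable.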