xAI LLM integration guide

How to use xAI LLM with LiveKit Agents.

Overview

xAI provides access to Grok models through their OpenAI-compatible API. These models are multilingual and support multimodal capabilities, making them suitable for a variety of agent applications.

Usage

Install the OpenAI plugin to add xAI support:

pip install "livekit-agents[openai]~=1.0"

Set the following environment variable in your .env file:

XAI_API_KEY=<your-xai-api-key>
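If you want to fail fast when the key is missing, a minimal sketch is shown below. The helper name is hypothetical and not part of the plugin; `with_x_ai` reads the environment variable itself.

```python
import os

def require_xai_api_key() -> str:
    """Return XAI_API_KEY from the environment, failing fast if unset.

    Illustrative helper only; openai.LLM.with_x_ai() picks up the
    variable on its own when the key is present.
    """
    key = os.environ.get("XAI_API_KEY")
    if not key:
        raise RuntimeError("XAI_API_KEY is not set; add it to your .env file")
    return key
```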

Create a Grok LLM using the with_x_ai method:

from livekit.agents import AgentSession
from livekit.plugins import openai

session = AgentSession(
    llm=openai.LLM.with_x_ai(
        model="grok-2-public",
        temperature=1.0,
    ),
    # ... tts, stt, vad, turn_detection, etc.
)

Parameters

This section describes some of the available parameters. For a complete list, see the method reference.

model: str | XAIChatModels (optional, default: grok-2-public)

Grok model to use. To learn more, see the xAI Grok models page.

temperature: float (optional, default: 1.0)

Controls the randomness of the model's output. Higher values (e.g., 0.8) make the output more random, while lower values (e.g., 0.2) make it more focused and deterministic.

Valid values are between 0 and 2. To learn more, see the optional parameters for Chat completions.
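A small sketch of the valid range. The clamp helper is illustrative only; the API rejects out-of-range values rather than clamping them.

```python
def clamp_temperature(value: float) -> float:
    """Clamp a requested temperature into the valid [0, 2] range."""
    return max(0.0, min(2.0, value))
```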

parallel_tool_calls: bool (optional)

Controls whether the model can make multiple tool calls in parallel. When enabled, the model can make multiple tool calls simultaneously, which can improve performance for complex tasks.
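Conceptually, parallel tool calls behave like independent coroutines run concurrently rather than one after another. A standalone asyncio sketch (not LiveKit code, with stand-in tool names):

```python
import asyncio

async def fake_tool(name: str, delay: float) -> str:
    # Stand-in for a tool call the model might request.
    await asyncio.sleep(delay)
    return f"{name}:done"

async def run_parallel() -> list[str]:
    # With parallel tool calls, independent calls execute concurrently,
    # so total latency is roughly max(delays), not sum(delays).
    return await asyncio.gather(
        fake_tool("weather", 0.01),
        fake_tool("news", 0.01),
    )
```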

tool_choice: ToolChoice | Literal['auto', 'required', 'none'] (optional, default: 'auto')

Controls how the model uses tools. Set to 'auto' to let the model decide, 'required' to force tool usage, or 'none' to disable tool usage.
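The three modes can be summarized with a small dispatcher sketch (hypothetical, for illustration only; this logic runs inside the model provider, not in your code):

```python
from typing import Literal

def should_call_tool(
    tool_choice: Literal["auto", "required", "none"],
    model_wants_tool: bool,
) -> bool:
    """Illustrate how tool_choice gates tool usage.

    'required' -> always make a tool call
    'none'     -> never make a tool call
    'auto'     -> defer to the model's own decision
    """
    if tool_choice == "required":
        return True
    if tool_choice == "none":
        return False
    return model_wants_tool  # 'auto'
```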

For more information about the xAI Grok LLM integration, see the xAI documentation and the LiveKit Agents plugin reference.