xAI LLM plugin guide

How to use xAI's Grok models with LiveKit Agents.

Available in Python | Node.js

Overview

This plugin allows you to use xAI as an LLM provider for your voice agents. The plugin supports the Responses API, the recommended endpoint, which enables xAI's provider tools (WebSearch, FileSearch, XSearch).

Quick reference

This section includes a basic usage example and some reference material. For links to more detailed documentation, see Additional resources.

Installation

Install the xAI plugin to add xAI support:

uv add "livekit-agents[xai]~=1.3"

Authentication

Set the following environment variable in your .env file:

XAI_API_KEY=<your-xai-api-key>
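
If you keep credentials in a .env file, load it before constructing your agent. A minimal sketch using python-dotenv (a common choice, not required by this plugin):

from dotenv import load_dotenv

# Read XAI_API_KEY (and any other variables) from .env into the process environment.
load_dotenv()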

Usage

Use xAI within an AgentSession or as a standalone LLM service. For example, you can use this LLM in the Voice AI quickstart.

from livekit.agents import AgentSession
from livekit.plugins import xai

# Use the Responses API (recommended)
session = AgentSession(
    llm=xai.responses.LLM(
        model="grok-4-1-fast-non-reasoning",
    ),
    # ... tts, stt, vad, turn_detection, etc.
)
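
The same LLM can also be used standalone, outside of an AgentSession. The following is a minimal sketch assuming the standard LiveKit Agents LLM interface (ChatContext and llm.chat()); exact chunk fields may vary by version:

import asyncio

from livekit.agents import llm
from livekit.plugins import xai

async def main():
    grok = xai.responses.LLM(model="grok-4-1-fast-non-reasoning")

    # Build a chat context and stream a completion directly from the LLM.
    chat_ctx = llm.ChatContext()
    chat_ctx.add_message(role="user", content="Say hello in one sentence.")

    stream = grok.chat(chat_ctx=chat_ctx)
    async for chunk in stream:
        if chunk.delta and chunk.delta.content:
            print(chunk.delta.content, end="")
    await stream.aclose()

asyncio.run(main())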

Parameters

This section describes some of the available parameters. For a complete reference of all available parameters, see the plugin reference links in the Additional resources section.

model (str, optional, default: grok-4-1-fast-non-reasoning)

Grok model to use. To learn more, see the xAI Grok models page.

temperature (float, optional, default: 1.0)

Controls the randomness of the model's output. Higher values, for example 0.8, make the output more random, while lower values, for example 0.2, make it more focused and deterministic.

Valid values are between 0 and 2. To learn more, see the optional parameters for the Responses API.

parallel_tool_calls (bool, optional)

Controls whether the model can make multiple tool calls in parallel. When enabled, the model can make multiple tool calls simultaneously, which can improve performance for complex tasks.

tool_choice (ToolChoice | Literal['auto', 'required', 'none'], optional, default: 'auto')

Controls how the model uses tools. Set to 'auto' to let the model decide, 'required' to force tool usage, or 'none' to disable tool usage.
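
Taken together, these options are passed to the LLM constructor. A brief sketch combining them (the values shown are illustrative, not recommendations):

from livekit.plugins import xai

grok_llm = xai.responses.LLM(
    model="grok-4-1-fast-non-reasoning",
    temperature=0.4,            # lower temperature for more focused output
    parallel_tool_calls=True,   # allow multiple tool calls in one turn
    tool_choice="auto",         # let the model decide when to use tools
)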

Additional resources

The following links provide more information about the xAI Grok LLM integration.