Cerebras LLM integration guide

How to use Cerebras inference with LiveKit Agents.

Available in: Python | Node.js

Overview

Cerebras provides access to Llama 3.1 and 3.3 models through its inference API. These models are multilingual and text-only, making them suitable for a variety of agent applications.

Usage

Install the OpenAI plugin to add Cerebras support:

Python:

pip install "livekit-agents[openai]~=1.2"

Node.js:

pnpm add @livekit/agents-plugin-openai@1.x

Set the following environment variable in your .env file:

CEREBRAS_API_KEY=<your-cerebras-api-key>
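The plugin picks up this key from the environment at runtime. If you load the .env file yourself during local development, a minimal sketch using python-dotenv (an assumption, not a requirement of the plugin) could look like this:

# Minimal sketch: load CEREBRAS_API_KEY from a local .env file.
# Assumes python-dotenv is installed (pip install python-dotenv).
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

assert os.getenv("CEREBRAS_API_KEY"), "CEREBRAS_API_KEY is not set"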

Create a Cerebras LLM using the with_cerebras method (withCerebras in Node.js):

Python:

from livekit.agents import AgentSession
from livekit.plugins import openai

session = AgentSession(
    llm=openai.LLM.with_cerebras(
        model="llama3.1-8b",
    ),
    # ... tts, stt, vad, turn_detection, etc.
)
Node.js:

import { voice } from '@livekit/agents';
import * as openai from '@livekit/agents-plugin-openai';

const session = new voice.AgentSession({
  llm: openai.LLM.withCerebras({
    model: "llama3.1-8b",
  }),
  // ... tts, stt, vad, turn_detection, etc.
});
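
For context, the session above normally runs inside an agent worker entrypoint. The following Python sketch shows one way to wire the Cerebras LLM into a minimal agent; the Assistant class, entrypoint, and worker boilerplate follow the general LiveKit Agents quickstart pattern and are illustrative rather than specific to Cerebras:

# Illustrative sketch: a minimal worker that uses the Cerebras LLM.
# The surrounding wiring follows the LiveKit Agents quickstart pattern.
from dotenv import load_dotenv

from livekit import agents
from livekit.agents import Agent, AgentSession
from livekit.plugins import openai

load_dotenv()


class Assistant(Agent):
    def __init__(self) -> None:
        super().__init__(instructions="You are a helpful assistant.")


async def entrypoint(ctx: agents.JobContext):
    await ctx.connect()

    session = AgentSession(
        llm=openai.LLM.with_cerebras(model="llama3.1-8b"),
        # stt, tts, vad, and turn_detection omitted here; add them for a voice agent
    )

    await session.start(room=ctx.room, agent=Assistant())


if __name__ == "__main__":
    agents.cli.run_app(agents.WorkerOptions(entrypoint_fnc=entrypoint))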

Parameters

This section describes some of the available parameters. For a complete list, see the plugin reference links in the Additional resources section.

model (str | CerebrasChatModels, optional, default: llama3.1-8b)

Model to use for inference. To learn more, see supported models.

temperature (float, optional, default: 1.0)

Controls the randomness of the model's output. Higher values, for example 0.8, make the output more random, while lower values, for example 0.2, make it more focused and deterministic.

Valid values are between 0 and 1.5. To learn more, see the Cerebras documentation.

parallel_tool_calls (bool, optional)

Controls whether the model can make multiple tool calls in parallel. When enabled, the model can make multiple tool calls simultaneously, which can improve performance for complex tasks.

tool_choice (ToolChoice | Literal['auto', 'required', 'none'], optional, default: 'auto')

Controls how the model uses tools. Set to 'auto' to let the model decide, 'required' to force tool usage, or 'none' to disable tool usage.
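
As an illustration, the parameters above can be passed directly to with_cerebras. The values below are examples only, not recommendations:

# Illustrative sketch: configuring the parameters described above.
from livekit.plugins import openai

llm = openai.LLM.with_cerebras(
    model="llama3.1-8b",       # any supported Cerebras model
    temperature=0.2,           # 0 to 1.5; lower values are more deterministic
    parallel_tool_calls=True,  # allow multiple tool calls in one turn
    tool_choice="auto",        # or "required" / "none"
)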

Additional resources

The following links provide more information about the Cerebras LLM integration.