DeepSeek LLM integration guide

How to use DeepSeek models with LiveKit Agents.

Overview

DeepSeek provides access to its latest models through an OpenAI-compatible API. These models are text-only and multilingual, making them suitable for a wide range of agent applications.

Additional providers

DeepSeek models are also available through a number of other providers, such as Cerebras and Groq.

Usage

Use the OpenAI plugin's with_deepseek method to set the default agent session LLM to DeepSeek.

Install the plugin from PyPI:

pip install "livekit-agents[openai]~=1.0"

Set the following environment variable in your .env file:

DEEPSEEK_API_KEY=<your-deepseek-api-key>

Then create a DeepSeek LLM instance:

from livekit.plugins import openai

deepseek_llm = openai.LLM.with_deepseek(
    model="deepseek-chat",  # DeepSeek-V3
    temperature=0.7,
)
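The resulting LLM can then be passed to an agent session. The snippet below collects the keyword arguments as a plain dict so they can be reused or logged; the AgentSession usage in the trailing comment is a minimal sketch, assuming the 1.x agents API.

```python
# Options for with_deepseek, collected as plain kwargs so they can be
# reused or inspected before constructing the LLM.
llm_options = {
    "model": "deepseek-chat",  # DeepSeek-V3
    "temperature": 0.7,
}

# A minimal sketch of wiring this into a session, assuming the 1.x API
# (STT/TTS providers omitted):
#
#   from livekit.plugins import openai
#   from livekit.agents import AgentSession
#
#   session = AgentSession(llm=openai.LLM.with_deepseek(**llm_options))
```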

Parameters

This section describes some of the available parameters. For a complete reference of all available parameters, see the method reference.

model: str | DeepSeekChatModels (optional, default: deepseek-chat)

DeepSeek model to use. See models and pricing for a complete list.

temperature: float (optional, default: 1.0)

Controls the randomness of the model's output. Higher values (e.g., 0.8) make the output more random, while lower values (e.g., 0.2) make it more focused and deterministic.

Valid values are between 0 and 2.
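Because out-of-range values are rejected by the API, it can be useful to validate the setting up front. validate_temperature below is a hypothetical helper written for this guide, not part of the plugin:

```python
def validate_temperature(value: float) -> float:
    """Hypothetical helper: DeepSeek accepts temperatures in [0, 2]."""
    if not 0.0 <= value <= 2.0:
        raise ValueError(
            f"temperature {value} is outside DeepSeek's valid range [0, 2]"
        )
    return value

# Lower values give more focused, deterministic output:
focused = validate_temperature(0.2)
# Higher values give more varied output:
creative = validate_temperature(0.8)
```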

parallel_tool_calls: bool (optional)

Controls whether the model can make multiple tool calls in parallel. When enabled, the model can make multiple tool calls simultaneously, which can improve performance for complex tasks.

tool_choice: ToolChoice | Literal['auto', 'required', 'none'] (optional, default: 'auto')

Controls how the model uses tools. Set to 'auto' to let the model decide, 'required' to force tool usage, or 'none' to disable tool usage.
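The two tool-related options are often set together. The snippet below just assembles the keyword arguments; the with_deepseek call in the trailing comment is a hedged sketch of the 1.x plugin API:

```python
# Tool-related options for with_deepseek.
tool_options = {
    # Allow several tool calls to be issued in a single model turn:
    "parallel_tool_calls": True,
    # 'auto' lets the model decide, 'required' forces a tool call,
    # 'none' disables tool usage entirely:
    "tool_choice": "auto",
}

# Sketch, assuming the 1.x plugin API:
#
#   from livekit.plugins import openai
#   llm = openai.LLM.with_deepseek(**tool_options)
```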
