Overview
DeepSeek provides access to their latest models through their OpenAI-compatible API. These models are multilingual and text-only, making them suitable for a variety of agent applications.
Usage
Use the OpenAI plugin's with_deepseek method to set the default agent session LLM to DeepSeek:
pip install "livekit-agents[openai]~=1.2"
pnpm add @livekit/agents-plugin-openai@1.x
Set the following environment variable in your .env file:
DEEPSEEK_API_KEY=<your-deepseek-api-key>
from livekit.agents import AgentSession
from livekit.plugins import openai

session = AgentSession(
    llm=openai.LLM.with_deepseek(
        model="deepseek-chat",  # this is DeepSeek-V3
    ),
)
import * as openai from '@livekit/agents-plugin-openai';
import { voice } from '@livekit/agents';

const session = new voice.AgentSession({
  llm: openai.LLM.withDeepSeek({
    model: "deepseek-chat", // this is DeepSeek-V3
  }),
});
Parameters
This section describes some of the available parameters. For a complete reference of all available parameters, see the plugin reference links in the Additional resources section.
model: DeepSeek model to use. See models and pricing for a complete list.
temperature: Controls the randomness of the model's output. Higher values, for example 0.8, make the output more random, while lower values, for example 0.2, make it more focused and deterministic. Valid values are between 0 and 2.
parallel_tool_calls: Controls whether the model can make multiple tool calls in parallel. When enabled, the model can make multiple tool calls simultaneously, which can improve performance for complex tasks.
tool_choice: Controls how the model uses tools. Set to 'auto' to let the model decide, 'required' to force tool usage, or 'none' to disable tool usage.
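Putting these parameters together, a configuration might look like the following sketch. The parameter values shown are illustrative, not recommended defaults; consult the plugin reference for the full signature of with_deepseek:

```python
from livekit.agents import AgentSession
from livekit.plugins import openai

# Illustrative values only; see the plugin reference for defaults.
session = AgentSession(
    llm=openai.LLM.with_deepseek(
        model="deepseek-chat",
        temperature=0.7,           # valid range is 0 to 2; lower is more deterministic
        parallel_tool_calls=True,  # allow simultaneous tool calls
        tool_choice="auto",        # 'auto', 'required', or 'none'
    ),
)
```

Any parameter you omit falls back to the plugin's default, so in most cases specifying only the model is sufficient.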
Additional resources
The following links provide more information about the DeepSeek LLM integration.