Overview
This plugin allows you to use Mistral AI as an LLM provider for your voice agents.
Installation
Install the LiveKit Mistral AI plugin from PyPI:
```shell
uv add "livekit-agents[mistralai]~=1.5"
```
Authentication
The Mistral AI integration requires a Mistral AI API key.
Set the MISTRAL_API_KEY in your .env file.
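For example, your `.env` file might contain an entry like the following (the value shown is a placeholder for your own key):

```shell
MISTRAL_API_KEY=<your-mistral-api-key>
```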
Usage
Use Mistral AI within an AgentSession or as a standalone LLM service. For example, you can use this LLM in the Voice AI quickstart.
```python
from livekit.plugins import mistralai

session = AgentSession(
    llm=mistralai.LLM(model="mistral-medium-latest"),
    # ... tts, stt, vad, turn_handling, etc.
)
```
Parameters
This section describes some of the available parameters. See the plugin reference for a complete list of all available parameters.
`model` (string | ChatModels) - Default: `ministral-8b-latest`
Which Mistral AI model to use. You can pass a string or a typed enum from ChatModels.

`temperature` (float)
Sampling temperature that controls the randomness of the model's output. Higher values make the output more random, while lower values make it more focused and deterministic. The range of valid values can vary by model.

`top_p` (float)
Nucleus sampling parameter.

`presence_penalty` (float)
Penalizes new tokens based on their presence in the text so far.

`frequency_penalty` (float)
Penalizes new tokens based on their frequency in the text so far.

`random_seed` (int)
Random seed for reproducibility.

`tool_choice` (ToolChoice | Literal['auto', 'required', 'none'])
Controls how the model uses tools. String options are as follows:
- `'auto'`: Let the model decide.
- `'required'`: Force tool usage.
- `'none'`: Disable tool usage.

`max_completion_tokens` (int)
The maximum number of tokens the LLM can output.
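As a sketch of how these parameters combine, the following configures the LLM with several of the options above. The specific values are illustrative, not recommendations:

```python
from livekit.plugins import mistralai

# Illustrative configuration; tune values for your use case.
llm = mistralai.LLM(
    model="mistral-medium-latest",
    temperature=0.4,            # lower values give more deterministic output
    top_p=0.9,                  # nucleus sampling threshold
    random_seed=42,             # for reproducible sampling
    tool_choice="auto",         # let the model decide when to call tools
    max_completion_tokens=512,  # cap on output length
)
```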
Provider tools
The Mistral Conversations API supports the following provider tools that enable the model to use built-in capabilities executed on the model server. These tools can be used alongside function tools defined in your agent's codebase.
| Tool | Description | Parameters |
|---|---|---|
| WebSearch | Search the internet for up-to-date information. | None |
| DocumentLibrary | Search uploaded document collections via Libraries. | `library_ids` (required) |
| CodeInterpreter | Write and execute Python code in a sandboxed environment. | None |
```python
from livekit.plugins import mistralai

agent = MyAgent(
    llm=mistralai.LLM(model="mistral-medium-latest"),
    tools=[mistralai.tools.WebSearch()],  # replace with any supported provider tool
)
```
Additional resources
The following resources provide more information about using Mistral AI with LiveKit Agents.
Python package
The livekit-plugins-mistralai package on PyPI.
Plugin reference
Reference for the Mistral AI LLM plugin.
GitHub repo
View the source or contribute to the LiveKit Mistral AI LLM plugin.
Mistral AI STT docs
Mistral AI STT documentation.
Mistral AI docs
Mistral AI platform documentation.
Voice AI quickstart
Get started with LiveKit Agents and Mistral AI.