Overview
LiveKit's Mistral AI plugin provides access to Mistral's instruction-tuned, code, and vision chat models through La Plateforme. You can use Mistral AI with LiveKit Agents for conversation, reasoning, and other text-generation tasks.
The Mistral AI plugin also supports STT models.
Quick reference
This section includes a basic usage example and some reference material. For links to more detailed documentation, see Additional resources.
Installation
Install the LiveKit Mistral AI plugin from PyPI:
```shell
pip install livekit-plugins-mistralai
```
Authentication
The Mistral AI integration requires a Mistral AI API key.
Set the MISTRAL_API_KEY in your .env file.
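For example, your .env file might contain the following entry (the placeholder value is illustrative; substitute your own key from La Plateforme):

```shell
# .env
MISTRAL_API_KEY=<your-mistral-api-key>
```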
Usage
Use Mistral AI within an AgentSession
or as a standalone LLM service. For example, you can use this LLM in the Voice AI quickstart.
```python
from livekit.plugins import mistralai

session = AgentSession(
    llm=mistralai.LLM(model="mistral-medium-latest"),
    # ... tts, stt, vad, turn_detection, etc.
)
```
Parameters
This section describes some of the available parameters. See the plugin reference for a complete list of all available parameters.
model
Which Mistral AI model to use. You can pass a string or a typed enum from ChatModels.
temperature
A measure of randomness in output. A lower value results in more predictable output, while a higher value results in more creative output.
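A minimal sketch of setting these parameters together (the `temperature` keyword is assumed to follow the plugin's LLM constructor signature; check the plugin reference for the authoritative list):

```python
from livekit.plugins import mistralai

# Illustrative configuration, not a definitive setup:
# a lower temperature biases the model toward more predictable output.
llm = mistralai.LLM(
    model="mistral-medium-latest",
    temperature=0.3,
)
```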
Additional resources
The following resources provide more information about using Mistral AI with LiveKit Agents.
Python package
The livekit-plugins-mistralai package on PyPI.
Plugin reference
Reference for the Mistral AI LLM plugin.
GitHub repo
View the source or contribute to the LiveKit Mistral AI LLM plugin.
Mistral AI STT docs
Mistral AI STT documentation.
Mistral AI docs
Mistral AI platform documentation.
Voice AI quickstart
Get started with LiveKit Agents and Mistral AI.