Mistral AI LLM plugin guide

How to integrate Mistral AI's La Plateforme inference service with LiveKit Agents.

Overview

This plugin allows you to use Mistral AI as an LLM provider for your voice agents.

Quick reference

This section includes a basic usage example and some reference material. For links to more detailed documentation, see Additional resources.

Installation

Install the LiveKit Mistral AI plugin from PyPI:

pip install "livekit-agents[mistralai]~=1.2"

Authentication

The Mistral AI integration requires a Mistral AI API key.

Set the MISTRAL_API_KEY environment variable in your .env file.
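
For example, your .env file might contain a line like the following (the key value shown is a placeholder):

MISTRAL_API_KEY=<your-mistral-api-key>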

Usage

Use Mistral AI within an AgentSession or as a standalone LLM service. For example, you can use this LLM in the Voice AI quickstart.

from livekit.agents import AgentSession
from livekit.plugins import mistralai

session = AgentSession(
    llm=mistralai.LLM(
        model="mistral-medium-latest",
    ),
    # ... tts, stt, vad, turn_detection, etc.
)
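
You can also use the LLM outside of an AgentSession. The following is a minimal sketch of standalone streaming, assuming the LiveKit Agents 1.x ChatContext and LLM.chat interfaces; the prompt text is illustrative:

import asyncio

from livekit.agents import llm
from livekit.plugins import mistralai

async def main():
    # Create the Mistral AI LLM and an empty chat context with one user message.
    mistral_llm = mistralai.LLM(model="mistral-medium-latest")
    chat_ctx = llm.ChatContext.empty()
    chat_ctx.add_message(role="user", content="Say hello in one sentence.")  # illustrative prompt

    # Stream the completion and print each text delta as it arrives.
    stream = mistral_llm.chat(chat_ctx=chat_ctx)
    async for chunk in stream:
        if chunk.delta and chunk.delta.content:
            print(chunk.delta.content, end="", flush=True)
    await stream.aclose()

asyncio.run(main())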

Parameters

This section describes some of the available parameters. See the plugin reference for a complete list of all available parameters.

model: string | ChatModels (optional; default: ministral-8b-2410)

Which Mistral AI model to use. You can pass a string or a typed enum from ChatModels.

temperature: float (optional)

Controls the randomness of the model's output. Higher values (for example, 0.8) make the output more random, while lower values (for example, 0.2) make it more focused and deterministic.
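
For example, you can pass both parameters when constructing the LLM; the model name and temperature value here are illustrative:

from livekit.plugins import mistralai

llm = mistralai.LLM(
    model="mistral-medium-latest",  # or a typed value from ChatModels
    temperature=0.2,  # lower value for more deterministic output
)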

Additional resources

The following resources provide more information about using Mistral AI with LiveKit Agents.