Mistral AI integration guide

How to integrate Mistral AI's La Plateforme inference service with LiveKit Agents.

Overview

LiveKit's Mistral AI plugin provides access to their instruction-tuned, code, and vision chat models through La Plateforme. You can use Mistral AI with LiveKit Agents for conversation, reasoning, and other text-generation tasks.

The Mistral AI plugin also supports STT models.
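For example, a minimal sketch of speech-to-text usage. This assumes the plugin exposes an STT class and that a Voxtral model name such as voxtral-mini-latest is available on La Plateforme; check the plugin reference for the exact class and defaults:

from livekit.plugins import mistralai

# Speech-to-text backed by Mistral's Voxtral models.
# The model name below is an assumption; consult the plugin reference.
stt = mistralai.STT(model="voxtral-mini-latest")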

Quick reference

This section includes a basic usage example and some reference material. For links to more detailed documentation, see Additional resources.

Installation

Install the LiveKit Mistral AI plugin from PyPI:

pip install livekit-plugins-mistralai

Authentication

The Mistral AI integration requires a Mistral AI API key.

Set MISTRAL_API_KEY in your .env file:
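For example, replacing the placeholder with your own key from La Plateforme:

# .env
MISTRAL_API_KEY=<your-mistral-api-key>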

Usage

Use Mistral AI within an AgentSession or as a standalone LLM service. For example, you can use this LLM in the Voice AI quickstart.

from livekit.agents import AgentSession
from livekit.plugins import mistralai

session = AgentSession(
    llm=mistralai.LLM(
        model="mistral-medium-latest",
    ),
    # ... tts, stt, vad, turn_detection, etc.
)
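To use the LLM as a standalone service outside an AgentSession, you can call its chat method with a ChatContext. The following is a minimal sketch assuming the standard LiveKit Agents LLM streaming interface; exact chunk attributes may differ between versions:

import asyncio

from livekit.agents.llm import ChatContext
from livekit.plugins import mistralai

async def main():
    llm = mistralai.LLM(model="mistral-medium-latest")

    # Build a chat context with a single user message
    chat_ctx = ChatContext()
    chat_ctx.add_message(role="user", content="Say hello in one sentence.")

    # chat() returns an async stream of response chunks;
    # the delta/content shape below is an assumption based on the v1.x API.
    async with llm.chat(chat_ctx=chat_ctx) as stream:
        async for chunk in stream:
            if chunk.delta and chunk.delta.content:
                print(chunk.delta.content, end="")

asyncio.run(main())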

Parameters

This section describes some of the available parameters. See the plugin reference for a complete list of all available parameters.

model: string | ChatModels (optional, default: ministral-8b-2410)

Which Mistral AI model to use. You can pass a string or a typed enum from ChatModels.

temperature: float (optional)

A measure of randomness in output. A lower value results in more predictable output, while a higher value results in more creative output.
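For example, to pin a lower temperature for more deterministic responses (the value shown is illustrative):

llm = mistralai.LLM(
    model="mistral-medium-latest",
    temperature=0.3,  # lower values make output more predictable
)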

Additional resources

The following resources provide more information about using Mistral AI with LiveKit Agents.