
OVHCloud LLM plugin guide

How to use the OVHCloud LLM plugin for LiveKit Agents.

Available in Python | Node.js

Overview

This plugin allows you to use OVHCloud AI Endpoints as an LLM provider for your voice agents. OVHCloud compatibility is provided by the OpenAI plugin using the Chat Completions API format.
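
Because the integration goes through the OpenAI plugin, the with_ovhcloud factory is roughly a preconfigured openai.LLM pointed at the OVHCloud AI Endpoints Chat Completions endpoint. The sketch below only illustrates this idea; the base URL shown is a placeholder, not the plugin's actual value:

import os
from livekit.plugins import openai

# Sketch only: roughly what with_ovhcloud sets up for you.
# Replace the placeholder base_url with the OVHCloud AI Endpoints
# OpenAI-compatible URL for your model.
llm = openai.LLM(
    model="gpt-oss-120b",
    base_url="<ovhcloud-ai-endpoints-openai-compatible-url>",
    api_key=os.environ["OVHCLOUD_API_KEY"],
)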

Quick reference

This section includes a basic usage example and some reference material. For links to more detailed documentation, see Additional resources.

Installation

Install the OpenAI plugin to add OVHCloud AI Endpoints support:

uv add "livekit-agents[openai]~=1.3"
pnpm add @livekit/agents-plugin-openai@1.x

Authentication

The OVHCloud AI Endpoints plugin requires an API key. You can generate one by creating a new Public Cloud project, then navigating to AI Endpoints > API key.

Set OVHCLOUD_API_KEY in your .env file.
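
For example, a minimal .env entry (the value is a placeholder for your own key):

OVHCLOUD_API_KEY=<your-ovhcloud-api-key>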

Usage

Use the OVHCloud AI Endpoints LLM in your AgentSession or as a standalone LLM service. For example, you can use this LLM in the Voice AI quickstart.

from livekit.agents import AgentSession
from livekit.plugins import openai

session = AgentSession(
    llm=openai.LLM.with_ovhcloud(
        model="gpt-oss-120b",
    ),
    # ... tts, stt, vad, turn_detection, etc.
)

import { voice } from '@livekit/agents';
import * as openai from '@livekit/agents-plugin-openai';

const session = new voice.AgentSession({
  llm: openai.LLM.withOVHcloud({
    model: 'gpt-oss-120b',
  }),
  // ... tts, stt, vad, turn_detection, etc.
});
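
To use the LLM as a standalone service outside of an AgentSession, you can stream a chat completion directly. A minimal Python sketch, assuming the 1.x ChatContext and LLM streaming interfaces:

import asyncio

from livekit.agents import llm
from livekit.plugins import openai

async def main() -> None:
    ovh_llm = openai.LLM.with_ovhcloud(model="gpt-oss-120b")

    # Build a chat context with a single user message
    chat_ctx = llm.ChatContext.empty()
    chat_ctx.add_message(role="user", content="Say hello in one sentence.")

    # chat() returns an async stream of response chunks
    async for chunk in ovh_llm.chat(chat_ctx=chat_ctx):
        if chunk.delta and chunk.delta.content:
            print(chunk.delta.content, end="", flush=True)
    print()

asyncio.run(main())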

Parameters

This section describes some of the available parameters. See the plugin reference links in the Additional resources section for a complete list of all available parameters.

model (string, optional). Default: gpt-oss-120b

Model to use for inference. To learn more, see supported models.

temperature (float, optional). Default: 1.0

Controls the randomness of the model's output. Higher values, for example 0.8, make the output more random, while lower values, for example 0.2, make it more focused and deterministic.

Valid values are between 0 and 1.

parallel_tool_calls (bool, optional)

Controls whether the model can make multiple tool calls in parallel. When enabled, the model can make multiple tool calls simultaneously, which can improve performance for complex tasks.

tool_choice (ToolChoice | Literal['auto', 'required', 'none'], optional). Default: auto

Controls how the model uses tools. Set to 'auto' to let the model decide, 'required' to force tool usage, or 'none' to turn off tool usage.
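
As an illustration, these parameters are passed when constructing the LLM. A brief sketch using the Python with_ovhcloud factory (values chosen for illustration only):

from livekit.plugins import openai

llm = openai.LLM.with_ovhcloud(
    model="gpt-oss-120b",
    temperature=0.2,            # more focused, deterministic output
    parallel_tool_calls=True,   # allow simultaneous tool calls
    tool_choice="auto",         # let the model decide when to call tools
)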

Additional resources

The following resources provide more information about using OVHCloud AI Endpoints with LiveKit Agents.