## Overview
This plugin allows you to use Together AI as an LLM provider for your voice agents. Together AI compatibility is provided by the OpenAI plugin using the Together AI Chat Completions API.
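Because the integration speaks the standard Chat Completions format, you can picture the request the plugin sends as a plain OpenAI-style payload aimed at Together's OpenAI-compatible endpoint. The helper below is a simplified sketch for illustration only, not part of the plugin's API; `build_chat_request` is a hypothetical name, and the base URL is Together's documented OpenAI-compatible endpoint.

```python
# Sketch of an OpenAI-compatible Chat Completions payload for Together AI.
# The field names follow the public Chat Completions format; the plugin's
# internals may differ.
TOGETHER_BASE_URL = "https://api.together.xyz/v1"  # OpenAI-compatible endpoint

def build_chat_request(model: str, user_text: str,
                       temperature: float = 1.0) -> dict:
    """Build a request body for POST {TOGETHER_BASE_URL}/chat/completions."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
        "temperature": temperature,
    }

payload = build_chat_request(
    "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo", "Hello!"
)
```

This is why the OpenAI plugin can provide Together AI support: only the base URL and API key change, while the request and response shapes stay the same.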
## Usage
Install the OpenAI plugin to add Together AI support:
```shell
# Python
uv add "livekit-agents[openai]~=1.4"

# Node.js
pnpm add @livekit/agents-plugin-openai@1.x
```
Set the following environment variable in your .env file:
```shell
TOGETHER_API_KEY=<your-together-api-key>
```
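Since the plugin reads `TOGETHER_API_KEY` from the environment, a startup check that fails fast with a clear message is easier to debug than a late authentication error. The helper below is a hypothetical sketch, not part of the plugin:

```python
import os

# Hypothetical startup check (not a plugin API): verify the key is present
# before creating the session, so a missing .env entry fails immediately.
def require_together_key() -> str:
    key = os.environ.get("TOGETHER_API_KEY", "").strip()
    if not key:
        raise RuntimeError(
            "TOGETHER_API_KEY is not set; add it to your .env file"
        )
    return key
```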
Create a Together AI LLM using the `with_together` method:
```python
from livekit.plugins import openai

session = AgentSession(
    llm=openai.LLM.with_together(
        model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    ),
    # ... tts, stt, vad, turn_handling, etc.
)
```
```typescript
import * as openai from '@livekit/agents-plugin-openai';

const session = new voice.AgentSession({
  llm: openai.LLM.withTogether({
    model: 'meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo',
  }),
  // ... tts, stt, vad, turnHandling, etc.
});
```
## Parameters
This section describes some of the available parameters. For a complete reference of all available parameters, see the plugin reference links in the Additional resources section.
- `model` (`str | TogetherChatModels`, default: `meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo`): Model to use for inference. To learn more, see supported models.
- `temperature` (`float`, default: `1.0`): Sampling temperature that controls the randomness of the model's output. Higher values make the output more random, while lower values make it more focused and deterministic. The range of valid values can vary by model; valid values are between 0 and 1.
- `parallel_tool_calls` (`bool`): Controls whether the model can make multiple tool calls in parallel. When enabled, the model can make multiple tool calls simultaneously, which can improve performance for complex tasks.
- `tool_choice` (`ToolChoice | Literal['auto', 'required', 'none']`, default: `'auto'`): Controls how the model uses tools. String options are as follows:
  - `'auto'`: Let the model decide.
  - `'required'`: Force tool usage.
  - `'none'`: Disable tool usage.
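To make the tool-related parameters concrete, here is an illustrative helper showing how `parallel_tool_calls` and the string forms of `tool_choice` map onto fields of a Chat Completions request body. The plugin performs this translation internally; the helper name and shape are assumptions for illustration only.

```python
# Illustrative only: how the tool-related parameters become Chat Completions
# request fields. Validation mirrors the documented string options.
VALID_TOOL_CHOICES = ("auto", "required", "none")

def build_tool_options(parallel_tool_calls: bool = True,
                       tool_choice: str = "auto") -> dict:
    """Return the tool-related fields of a Chat Completions request body."""
    if tool_choice not in VALID_TOOL_CHOICES:
        raise ValueError(f"tool_choice must be one of {VALID_TOOL_CHOICES}")
    return {
        "parallel_tool_calls": parallel_tool_calls,
        "tool_choice": tool_choice,
    }
```

For example, passing `tool_choice="required"` forces the model to call a tool on every turn, while `"none"` disables tool use entirely.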
## Additional resources
The following links provide more information about the Together AI LLM integration.