Overview
Groq provides fast LLM inference using open models such as Llama and DeepSeek. With LiveKit's Groq integration and the Agents framework, you can build low-latency voice AI applications.
Quick reference
This section includes a basic usage example and some reference material. For links to more detailed documentation, see Additional resources.
Installation
Install the plugin from PyPI:
pip install "livekit-agents[groq]~=1.0rc"
Authentication
The Groq plugin requires a Groq API key.
Set GROQ_API_KEY in your .env file.
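Before starting your agent, it can help to fail fast if the key is missing. The following is a minimal, standard-library-only sketch; the helper name and the placeholder key value are illustrative, not part of the plugin API:

```python
import os

def require_api_key(env=os.environ) -> str:
    """Return GROQ_API_KEY, or raise a clear error if it is not set."""
    key = env.get("GROQ_API_KEY", "")
    if not key:
        raise RuntimeError("GROQ_API_KEY is not set; add it to your .env file")
    return key

# Example with a stand-in environment (placeholder value, not a real key):
print(require_api_key({"GROQ_API_KEY": "gsk_placeholder"}))
```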
Usage
Use a Groq LLM in your AgentSession or as a standalone LLM service. For example, you can use this LLM in the Voice AI quickstart.
from livekit.plugins import groq

session = AgentSession(
    llm=groq.LLM(model="llama3-8b-8192"),
    # ... tts, stt, vad, turn_detection, etc.
)
Parameters
This section describes some of the available parameters. For a complete reference of all available parameters, see the plugin reference.
model: Name of the LLM model to use. For all options, see the Groq model list.
temperature: A measure of randomness in output. A lower value results in more predictable output, while a higher value results in more creative output.
parallel_tool_calls: Set to True to parallelize tool calls.
tool_choice: Specifies whether the model may use tools during response generation.
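Putting these together, a configuration sketch might look like the following. Parameter names such as parallel_tool_calls and tool_choice follow common LiveKit plugin conventions; check them against the plugin reference before relying on them:

```python
from livekit.plugins import groq

llm = groq.LLM(
    model="llama3-8b-8192",    # model name from the Groq model list
    temperature=0.8,           # higher values produce more creative output
    parallel_tool_calls=True,  # allow tool calls to run in parallel
    tool_choice="auto",        # let the model decide whether to use tools
)
```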
Additional resources
The following resources provide more information about using Groq with LiveKit Agents.
Python package
The livekit-plugins-groq package on PyPI.
Plugin reference
Reference for the Groq LLM plugin.
GitHub repo
View the source or contribute to the LiveKit Groq LLM plugin.
Groq docs
Official Groq documentation.
Voice AI quickstart
Get started with LiveKit Agents and Groq.
Groq ecosystem overview
Overview of the entire Groq and LiveKit Agents integration.