Overview
Anthropic provides Claude, an AI assistant with capabilities including advanced reasoning, vision analysis, code generation, and multilingual processing. With LiveKit's Anthropic integration and the Agents framework, you can build sophisticated voice AI applications.
You can also use Claude with Amazon Bedrock.
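If you host Claude on Bedrock, the session wiring is the same but uses LiveKit's AWS plugin instead. The sketch below is illustrative only: it assumes the livekit-plugins-aws package and its aws.LLM Bedrock integration, and the model and region arguments shown are placeholders, so check the Amazon Bedrock integration docs for the exact constructor parameters.

# Illustrative sketch: Bedrock-hosted Claude via LiveKit's AWS plugin.
# The aws.LLM arguments shown (model, region) are assumptions; consult the
# Amazon Bedrock integration guide for the exact parameter names.
from livekit.agents import AgentSession
from livekit.plugins import aws

session = AgentSession(
    llm=aws.LLM(
        model="anthropic.claude-3-5-sonnet-20240620-v1:0",
        region="us-east-1",
    ),
    # ... tts, stt, vad, turn_detection, etc.
)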
Quick reference
This section includes a basic usage example and some reference material. For links to more detailed documentation, see Additional resources.
Installation
Install the plugin from PyPI:
pip install "livekit-agents[anthropic]~=1.0"
Authentication
The Anthropic plugin requires an Anthropic API key.
Set ANTHROPIC_API_KEY in your .env file.
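For example, your .env file can contain an entry like the following, where the value is a placeholder for your own key:

ANTHROPIC_API_KEY=<your-anthropic-api-key>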
Usage
Use Claude within an AgentSession or as a standalone LLM service. For example, you can use this LLM in the Voice AI quickstart.
from livekit.agents import AgentSession
from livekit.plugins import anthropic

session = AgentSession(
    llm=anthropic.LLM(
        model="claude-3-5-sonnet-20241022",
        temperature=0.8,
    ),
    # ... tts, stt, vad, turn_detection, etc.
)
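For context, here is a minimal sketch of how such a session is typically started from an agent entrypoint, following the Voice AI quickstart pattern; the instructions prompt is a placeholder, and the speech components (TTS, STT, VAD, turn detection) are omitted for brevity.

from livekit import agents
from livekit.agents import Agent, AgentSession
from livekit.plugins import anthropic

async def entrypoint(ctx: agents.JobContext):
    # Connect to the room, then start a session that uses Claude as the LLM.
    await ctx.connect()
    session = AgentSession(
        llm=anthropic.LLM(model="claude-3-5-sonnet-20241022"),
        # ... tts, stt, vad, turn_detection, etc.
    )
    # The instructions string is a placeholder for your own prompt.
    await session.start(
        room=ctx.room,
        agent=Agent(instructions="You are a helpful voice AI assistant."),
    )

if __name__ == "__main__":
    agents.cli.run_app(agents.WorkerOptions(entrypoint_fnc=entrypoint))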
Parameters
This section describes some of the available parameters. See the plugin reference for a complete list of all available parameters.
model: The model to use. For a full list of available models, see Model options.
max_tokens: The maximum number of tokens to generate before stopping. To learn more, see the Anthropic API reference.
temperature: A measure of randomness in output. A lower value results in more predictable output, while a higher value results in more creative output. Valid values are between 0 and 1. To learn more, see the Anthropic API reference.
parallel_tool_calls: Set to true to parallelize tool calls.
tool_choice: Specifies whether to use tools during response generation.
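As an illustration of how these options are passed, the following sketch sets several of them on the LLM constructor. The model and temperature arguments appear in the usage example above; the remaining names mirror the descriptions in this list, so verify them against the plugin reference before relying on them.

from livekit.plugins import anthropic

# Illustrative only: parameter names other than model and temperature are
# assumptions based on the descriptions above; confirm them in the plugin reference.
llm = anthropic.LLM(
    model="claude-3-5-sonnet-20241022",
    temperature=0.8,           # valid values are between 0 and 1
    parallel_tool_calls=True,  # run tool calls in parallel
    tool_choice="auto",        # whether tools are used during response generation
)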
Additional resources
The following links provide more information about the Anthropic LLM plugin.
Python package
The livekit-plugins-anthropic
package on PyPI.
Plugin reference
Reference for the Anthropic LLM plugin.
GitHub repo
View the source or contribute to the LiveKit Anthropic LLM plugin.
Anthropic docs
Anthropic Claude docs.
Voice AI quickstart
Get started with LiveKit Agents and Anthropic.