Keyframe virtual avatar integration guide

How to use the Keyframe virtual avatar plugin for LiveKit Agents.

Available in
Python

Overview

Keyframe Labs provides hyper-realistic, emotionally expressive avatars that can participate in live, interactive conversations. You can use the open source Keyframe integration for LiveKit Agents to add a Keyframe avatar to your voice AI app.

Quick reference

This section includes a basic usage example and some reference material. For links to more detailed documentation, see Additional resources.

Installation

uv add "livekit-agents[keyframe]~=1.4"

Authentication

The Keyframe plugin requires a Keyframe API key.

Set KEYFRAME_API_KEY in your .env file.
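For example, your .env file might contain an entry like the following (the value shown is a placeholder, not a real key):

```shell
# .env: loaded by the agent at startup
KEYFRAME_API_KEY=<your-keyframe-api-key>
```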

Persona setup

The Keyframe plugin requires a persona, a pre-configured avatar identity. You can specify the persona by ID or by slug, passing persona_id or persona_slug respectively. You must provide exactly one of these parameters.

Persona slug

A slug is a human-readable identifier in the format "public:<persona_name>". Browse available public personas at the Keyframe platform.

avatar = keyframe.AvatarSession(
    persona_slug="public:cosmo_persona-1.5-live",
)

Persona ID

A persona ID is a UUID that uniquely identifies a persona. You can find persona IDs in the Keyframe platform.

avatar = keyframe.AvatarSession(
    persona_id="ab85a2a0-0555-428d-87b2-ff3019a58b93",
)
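If you want to sanity-check an ID before passing it to the plugin, the standard library's uuid module can validate it (an optional illustration, not something the plugin requires):

```python
import uuid

persona_id = "ab85a2a0-0555-428d-87b2-ff3019a58b93"

# uuid.UUID raises ValueError if the string is not a well-formed UUID
parsed = uuid.UUID(persona_id)
print(parsed)
```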

Usage

Use the plugin in an AgentSession. For example, you can use this avatar in the Voice AI quickstart.

from livekit import agents
from livekit.agents import AgentServer, AgentSession
from livekit.plugins import keyframe

server = AgentServer()

@server.rtc_session(agent_name="my-agent")
async def my_agent(ctx: agents.JobContext):
    session = AgentSession(
        # ... stt, llm, tts, etc.
    )

    avatar = keyframe.AvatarSession(
        persona_slug="public:cosmo_persona-1.5-live",  # or use persona_id. See "Persona setup" for details.
    )

    # Start the avatar and wait for it to join
    await avatar.start(session, room=ctx.room)

    # Start your agent session with the user
    await session.start(
        # ... room, agent, room_options, etc.
    )

Preview the avatar in the Agents Playground or a frontend starter app that you build.

Emotion control

persona-1.5-live models only

Emotion control is only available for personas powered by persona-1.5-live.

The Keyframe plugin supports dynamic emotion control. You can change the avatar's facial expression at any time during a conversation using set_emotion():

await avatar.set_emotion("happy") # "neutral", "happy", "sad", "angry"

To let the LLM control the avatar's expression automatically, register it as a function tool on your agent:

from livekit.agents import Agent, RunContext, function_tool
from livekit.plugins import keyframe
from livekit.plugins.keyframe import Emotion


class AvatarAgent(Agent):
    def __init__(self, avatar: keyframe.AvatarSession) -> None:
        super().__init__(
            instructions=(
                "You are a friendly voice assistant with an avatar. "
                "Use the set_emotion tool to change your facial expression "
                "whenever the conversation mood shifts."
            ),
        )
        self._avatar = avatar

    @function_tool()
    async def set_emotion(self, context: RunContext, emotion: Emotion) -> str:
        """Set the avatar's facial expression to match the conversation mood.

        Args:
            emotion: The emotion to express. One of 'neutral', 'happy', 'sad', or 'angry'.
        """
        await self._avatar.set_emotion(emotion)
        return f"Emotion set to {emotion}"

Parameters

This section describes some of the available parameters. See the plugin reference for a complete list of all available parameters.

api_key (string, required)

Keyframe API key. Defaults to the KEYFRAME_API_KEY environment variable.

persona_id (string, optional)

UUID of the Keyframe persona to use. Mutually exclusive with persona_slug. See Persona setup for details.

persona_slug (string, optional)

Slug identifier for a Keyframe persona (e.g. "public:cosmo_persona-1.5-live"). Mutually exclusive with persona_id. See Persona setup for details.

Additional resources

The following resources provide more information about using Keyframe with LiveKit Agents.