LangChain integration guide

How to use LangGraph workflows and LangChain agents with LiveKit.

Available in
Python

Overview

This plugin allows you to use LangGraph and other graph-based LangChain agents as an LLM provider for your voice agents.

Installation

Install the LiveKit LangChain plugin from PyPI:

uv add "livekit-agents[langchain]~=1.4"

Usage

To use LangGraph workflows within an AgentSession, wrap them with the LLMAdapter. For example, you can use this LLM in the Voice AI quickstart.

from langgraph.graph import StateGraph

from livekit.agents import AgentSession, Agent
from livekit.plugins import langchain


# Define your LangGraph workflow
def create_workflow():
    workflow = StateGraph(...)
    # Add your nodes and edges
    return workflow.compile()


# Use the workflow as an LLM
session = AgentSession(
    llm=langchain.LLMAdapter(
        graph=create_workflow()
    ),
    # ... stt, tts, vad, turn_detection, etc.
)

The LLMAdapter automatically converts the LiveKit chat context to LangChain messages. The mapping is as follows:

  • system and developer messages to SystemMessage
  • user messages to HumanMessage
  • assistant messages to AIMessage
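The mapping above can be sketched as a simple lookup. This is an illustrative model of the conversion, not the plugin's actual internals; the function and table names are hypothetical.

```python
# Hypothetical sketch of the role mapping LLMAdapter performs when
# converting a LiveKit chat context to LangChain messages.
ROLE_TO_LANGCHAIN_TYPE = {
    "system": "SystemMessage",
    "developer": "SystemMessage",
    "user": "HumanMessage",
    "assistant": "AIMessage",
}


def to_langchain_type(role: str) -> str:
    """Return the LangChain message class name for a LiveKit chat role."""
    try:
        return ROLE_TO_LANGCHAIN_TYPE[role]
    except KeyError:
        raise ValueError(f"Unsupported chat role: {role}")
```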

Parameters

This section describes the available parameters for the LLMAdapter. See the plugin reference for a complete list of all available parameters.

graph
Required
PregelProtocol

The LangGraph workflow to use as an LLM. Must be a locally compiled graph. To learn more, see Graph Definitions.

config
Optional
RunnableConfig | None
Default: None

Configuration options for the LangGraph workflow execution. This can include runtime configuration, callbacks, and other LangGraph-specific options. To learn more, see RunnableConfig.
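Because RunnableConfig is a TypedDict, a plain dict with the same shape can be passed. A minimal sketch of such a dict follows; the keys under "configurable" are illustrative and should match whatever your graph actually reads.

```python
# Sketch of a RunnableConfig-style dict for LLMAdapter's config parameter.
# "thread_id" and the tag are example values, not required keys.
runnable_config = {
    "configurable": {"thread_id": "session-1"},
    "tags": ["voice"],
}

# Passed as: langchain.LLMAdapter(graph=..., config=runnable_config)
```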

Supported LangChain agent types

The LiveKit LangChain plugin supports LangGraph and other graph-based LangChain agents. These agents compile to a CompiledStateGraph (Pregel-compatible), which the LLMAdapter accepts directly.

To add voice to an existing LangChain agent, pass the compiled graph to LLMAdapter and use it as the Agent's LLM: llm=langchain.LLMAdapter(graph=your_compiled_graph).
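For instance, a prebuilt ReAct agent from langgraph.prebuilt compiles to a graph that can be handed to LLMAdapter as-is. This is a sketch of the wiring, not a complete agent: the model string and empty tool list are placeholder values, and the session still needs stt, tts, and related components.

```python
from langgraph.prebuilt import create_react_agent

from livekit.agents import AgentSession
from livekit.plugins import langchain

# create_react_agent returns a compiled graph (Pregel-compatible),
# so it can be passed to LLMAdapter directly. The model identifier
# and tools below are illustrative.
agent_graph = create_react_agent("openai:gpt-4o-mini", tools=[])

session = AgentSession(
    llm=langchain.LLMAdapter(graph=agent_graph),
    # ... stt, tts, vad, etc.
)
```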

The plugin does not support non-graph patterns such as plain LCEL chains (prompt | llm) or bare chat models.

For complete examples including LangGraph, LangChain agents, and deep agents, see the recipes page.

Latency

This plugin uses LangGraph's streaming mode to minimize time to first token, but take care when porting LangChain workflows that were not originally designed for voice use cases. For more information on handling long-running operations and providing a better user experience, see the user feedback documentation.

Additional resources

The following resources provide more information about using LangChain with LiveKit Agents.