Overview
This plugin allows you to use LangGraph and other graph-based LangChain agents as an LLM provider for your voice agents.
Installation
Install the LiveKit LangChain plugin from PyPI:
```shell
uv add "livekit-agents[langchain]~=1.4"
```
Usage
To use LangGraph workflows within an AgentSession, wrap them with the LLMAdapter. For example, you can use this LLM in the Voice AI quickstart.
```python
from langgraph.graph import StateGraph

from livekit.agents import AgentSession, Agent
from livekit.plugins import langchain

# Define your LangGraph workflow
def create_workflow():
    workflow = StateGraph(...)
    # Add your nodes and edges
    return workflow.compile()

# Use the workflow as an LLM
session = AgentSession(
    llm=langchain.LLMAdapter(graph=create_workflow()),
    # ... stt, tts, vad, turn_handling, etc.
)
```
The LLMAdapter automatically converts the LiveKit chat context to LangChain messages. The mapping is as follows:
- `system` and `developer` messages to `SystemMessage`
- `user` messages to `HumanMessage`
- `assistant` messages to `AIMessage`
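The mapping can be sketched as a simple role lookup. The snippet below is illustrative only; it mirrors the documented behavior rather than the adapter's actual implementation:

```python
# Illustrative sketch of the role-to-message mapping described above.
# This is not the adapter's real code; it only mirrors the documented rules.
ROLE_TO_MESSAGE_TYPE = {
    "system": "SystemMessage",
    "developer": "SystemMessage",
    "user": "HumanMessage",
    "assistant": "AIMessage",
}

def langchain_message_type(role: str) -> str:
    """Return the LangChain message class name for a LiveKit chat role."""
    return ROLE_TO_MESSAGE_TYPE[role]
```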
Parameters
This section describes the available parameters for the LLMAdapter. See the plugin reference for a complete list of all available parameters.
`graph` (`PregelProtocol`): The LangGraph workflow to use as an LLM. Must be a locally compiled graph. To learn more, see Graph Definitions.
`config` (`RunnableConfig | None`, default: `None`): Configuration options for the LangGraph workflow execution. This can include runtime configuration, callbacks, and other LangGraph-specific options. To learn more, see RunnableConfig.
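A `RunnableConfig` is a plain dictionary. As a minimal sketch, the `configurable` and `tags` keys below are standard LangChain options, while the `thread_id` value is a made-up example:

```python
# A RunnableConfig is a plain dict. "configurable" and "tags" are standard
# LangChain keys; the thread_id value here is a hypothetical example.
config = {
    "configurable": {"thread_id": "voice-session-1"},
    "tags": ["voice-agent"],
}

# This could then be passed to the adapter alongside the graph, e.g.:
# langchain.LLMAdapter(graph=create_workflow(), config=config)
```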
Supported LangChain agent types
The LiveKit LangChain plugin supports LangGraph and other graph-based LangChain agents:
- Agents built with `create_agent`.
- Deep agents built with `create_deep_agent`.
All of these return a CompiledStateGraph (Pregel-compatible), which the LLMAdapter accepts directly.
To add voice to an existing LangChain agent, pass the compiled graph to `LLMAdapter` and use it as the Agent's LLM: `llm=langchain.LLMAdapter(graph=your_compiled_graph)`.
The plugin does not support non-graph patterns such as plain LCEL chains (`prompt | llm`) or bare chat models.
For complete examples including LangGraph, LangChain agents, and deep agents, see the recipes page.
Latency
This plugin uses LangGraph's streaming mode to minimize time-to-first-token, but take care when porting LangChain workflows that were not originally designed for voice use cases. For more information on handling long-running operations and providing a better user experience, see the user feedback documentation.
Additional resources
The following resources provide more information about using LangChain with LiveKit Agents.
Python package
The livekit-plugins-langchain package on PyPI.
Plugin reference
Reference for the LangChain LLM adapter.
GitHub repo
View the source or contribute to the LiveKit LangChain plugin.
LangChain docs
LangChain documentation and tutorials.
LangGraph docs
LangGraph documentation for building stateful workflows.
Example apps
Examples showing LangChain integration with LiveKit.