Audio visualizer

Components for visualizing agent and user audio in your frontend.

Overview

Audio visualizer components give your voice agent a visual presence in your application. They render animated visualizations driven by two inputs: the audio track's volume levels and the agent's current state (listening, thinking, speaking). This combination means the visualizer responds naturally to conversation flow — animating during speech, settling during silence, and reflecting state transitions like thinking pauses.

Choosing a visualizer

Agents UI includes five visualizer variants, each with a distinct visual style. All share the same props interface (audioTrack, state, size), so you can swap between them without changing your code.

| Component | Style | Best for |
| --- | --- | --- |
| AgentAudioVisualizerBar | Vertical bars that react to audio levels. | Clean, minimal interfaces. Configurable bar count and size. |
| AgentAudioVisualizerGrid | A grid of cells that pulse with audio. | Compact layouts where a subtle pattern works well. |
| AgentAudioVisualizerRadial | A circular visualization that expands outward. | Centered, prominent agent displays. |
| AgentAudioVisualizerWave | A flowing waveform line. | Horizontal layouts or inline with text. |
| AgentAudioVisualizerAura | A glowing, organic aura designed in partnership with Unicorn Studio. | Premium, immersive experiences with a distinctive look. |
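
Because every variant accepts the same props, switching styles only changes the import and the JSX tag. A minimal sketch, assuming the wave variant has also been added from the Agents UI registry (the import path below mirrors the bar variant's and is illustrative):

'use client';

import { useVoiceAssistant } from '@livekit/components-react';
// Illustrative path; adjust to wherever the Agents UI CLI placed the wave variant.
import { AgentAudioVisualizerWave } from '@/components/agents-ui/agent-audio-visualizer-wave';

export function WaveDemo() {
  const { audioTrack, state } = useVoiceAssistant();

  // Same audioTrack, state, and size props as every other variant.
  return <AgentAudioVisualizerWave size="lg" state={state} audioTrack={audioTrack} />;
}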

React example

Install a visualizer from Agents UI. This also installs all necessary dependencies, like @livekit/components-react.

pnpm dlx shadcn@latest add @agents-ui/agent-audio-visualizer-bar
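
If your project uses npm instead of pnpm, the equivalent shadcn CLI command should be:

npx shadcn@latest add @agents-ui/agent-audio-visualizer-bar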

Call the useVoiceAssistant hook and pass the audioTrack and state to the component. useVoiceAssistant must be called within an AgentSessionProvider context.

'use client';

import { useVoiceAssistant } from '@livekit/components-react';
import { AgentAudioVisualizerBar } from '@/components/agents-ui/agent-audio-visualizer-bar';

export function Demo() {
  const { audioTrack, state } = useVoiceAssistant();

  return (
    <AgentAudioVisualizerBar
      size="lg"
      state={state}
      barCount={5}
      audioTrack={audioTrack}
    />
  );
}
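
The state value returned by useVoiceAssistant can also drive UI around the visualizer. A minimal sketch, assuming the AgentState values exposed by @livekit/components-react include 'disconnected', 'connecting', 'listening', 'thinking', and 'speaking':

'use client';

import { useVoiceAssistant } from '@livekit/components-react';
import { AgentAudioVisualizerBar } from '@/components/agents-ui/agent-audio-visualizer-bar';

export function DemoWithStatus() {
  const { audioTrack, state } = useVoiceAssistant();

  // Hide the visualizer until the agent has joined the session.
  if (state === 'disconnected' || state === 'connecting') {
    return <p>Waiting for the agent to join...</p>;
  }

  return (
    <div>
      <AgentAudioVisualizerBar size="lg" state={state} audioTrack={audioTrack} />
      {/* Reuse the same state value for a simple status label */}
      <p>Agent is {state}</p>
    </div>
  );
}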

Also check out the AgentControlBar, which provides a simple set of common UI controls for voice agent applications, as well as the other audio visualizer components listed above.

Other platform visualizers

The LiveKit component SDKs for SwiftUI, Android Compose, and Flutter also include audio visualizer components.

For SwiftUI, first install the components package from https://github.com/livekit/components-swift.

Then use the AgentBarAudioVisualizer view to display the agent's audio and state:

struct AgentView: View {
    // Load the room from the environment
    @EnvironmentObject private var room: Room

    // Find the first agent participant in the room
    private var agentParticipant: RemoteParticipant? {
        for participant in room.remoteParticipants.values {
            if participant.kind == .agent {
                return participant
            }
        }
        return nil
    }

    // Reads the agent state property
    private var agentState: AgentState {
        agentParticipant?.agentState ?? .initializing
    }

    var body: some View {
        AgentBarAudioVisualizer(audioTrack: agentParticipant?.firstAudioTrack, agentState: agentState, barColor: .primary, barCount: 5)
            .id(agentParticipant?.firstAudioTrack?.id)
    }
}

For Android Compose, first install the components package from https://github.com/livekit/components-android.

Then use the rememberVoiceAssistant and VoiceAssistantBarVisualizer composables to display the visualizer, assuming you are within a RoomScope composable already.

import androidx.compose.foundation.layout.fillMaxWidth
import androidx.compose.foundation.layout.padding
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp
import io.livekit.android.compose.state.rememberVoiceAssistant
import io.livekit.android.compose.ui.audio.VoiceAssistantBarVisualizer

@Composable
fun AgentAudioVisualizer(modifier: Modifier = Modifier) {
    // Get the voice assistant instance
    val voiceAssistant = rememberVoiceAssistant()

    // Display the audio visualization
    VoiceAssistantBarVisualizer(
        voiceAssistant = voiceAssistant,
        modifier = modifier
            .padding(8.dp)
            .fillMaxWidth()
    )
}

For Flutter, first install the components package from https://github.com/livekit/components-flutter.

flutter pub add livekit_components

Enable audio visualization when creating the Room:

// Enable audio visualization when creating the Room
final room = Room(roomOptions: const RoomOptions(enableVisualizer: true));

Then use the SoundWaveformWidget to display the agent's audio visualization, assuming you're using a RoomContext:

import 'package:flutter/material.dart';
import 'package:livekit_client/livekit_client.dart';
import 'package:livekit_components/livekit_components.dart' hide ParticipantKind;
import 'package:provider/provider.dart';

/// Shows a simple audio visualizer for an agent participant
class AgentView extends StatelessWidget {
  const AgentView({super.key});

  @override
  Widget build(BuildContext context) {
    return Consumer<RoomContext>(
      builder: (context, roomContext, child) {
        // Find the agent participant in the room
        final agentParticipant = roomContext.room.remoteParticipants.values
            .where((p) => p.kind == ParticipantKind.AGENT)
            .firstOrNull;
        if (agentParticipant == null) {
          return const SizedBox.shrink();
        }

        // Get the agent's audio track for visualization
        final audioTrack = agentParticipant.audioTrackPublications
            .firstOrNull?.track as AudioTrack?;
        if (audioTrack == null) {
          return const SizedBox.shrink();
        }

        // Show the waveform visualization
        return SoundWaveformWidget(
          audioTrack: audioTrack,
          options: AudioVisualizerOptions(
            width: 32,
            minHeight: 32,
            maxHeight: 256,
            color: Theme.of(context).colorScheme.primary,
            count: 7,
          ),
        );
      },
    );
  }
}

See the full API reference for each visualizer variant, including interactive previews and prop documentation.