Voice AI quickstart
To build your first voice AI app with SwiftUI, use the following quickstart and starter app. Otherwise, follow the getting started guide below.
- Voice AI quickstart
- SwiftUI Voice Agent
Getting started guide
This guide uses the LiveKit Swift Components library, which is the easiest way to get started on iOS. LiveKit also supports macOS, tvOS, and visionOS; more documentation for the core Swift SDK is available on GitHub.
Follow the steps below to build your first LiveKit app with SwiftUI.
SDK installation
```swift
let package = Package(
    ...
    dependencies: [
        .package(url: "https://github.com/livekit/client-sdk-swift.git", from: "2.5.0"), // Core SDK
        .package(url: "https://github.com/livekit/components-swift.git", from: "0.1.0"), // UI Components
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [
                .product(name: "LiveKitComponents", package: "components-swift"),
            ]
        )
    ]
)
```
Permissions and entitlements
You must add privacy strings for both camera and microphone usage to your Info.plist file, even if you don't plan to use both in your app.
```xml
<dict>
  ...
  <key>NSCameraUsageDescription</key>
  <string>$(PRODUCT_NAME) uses your camera</string>
  <key>NSMicrophoneUsageDescription</key>
  <string>$(PRODUCT_NAME) uses your microphone</string>
  ...
</dict>
```
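With the usage strings in place, iOS prompts the user automatically the first time your app accesses the microphone or camera. If you'd rather request access up front, you can use AVFoundation directly; this helper is an illustrative sketch, not part of the LiveKit SDK:

```swift
import AVFoundation

/// Request microphone and camera access before joining a room,
/// so permission prompts don't interrupt the session.
/// (Illustrative helper; iOS also prompts automatically on first use.)
func requestMediaPermissions() async -> Bool {
    let micGranted = await AVCaptureDevice.requestAccess(for: .audio)
    let cameraGranted = await AVCaptureDevice.requestAccess(for: .video)
    return micGranted && cameraGranted
}
```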
To continue audio sessions in the background, add the Audio, AirPlay, and Picture in Picture background mode in the Capabilities tab of your app target in Xcode.
Your Info.plist
should have the following entries:
```xml
<dict>
  ...
  <key>UIBackgroundModes</key>
  <array>
    <string>audio</string>
  </array>
  ...
</dict>
```
Connecting to LiveKit
This simple example uses a hardcoded token that expires in 2 hours. In a real app, your server generates tokens and your client requests them.
```swift
// !! Note !!
// This sample hardcodes a token which expires in 2 hours.
// In production you should generate tokens on your server, and your client
// should request a token from your server.
@preconcurrency import LiveKit
import LiveKitComponents
import SwiftUI

let wsURL = "<your LiveKit server URL>"
let token = "<generate a token>"

struct ContentView: View {
    @StateObject private var room: Room

    init() {
        let room = Room()
        _room = StateObject(wrappedValue: room)
    }

    var body: some View {
        Group {
            if room.connectionState == .disconnected {
                Button("Connect") {
                    Task {
                        do {
                            try await room.connect(
                                url: wsURL,
                                token: token,
                                connectOptions: ConnectOptions(enableMicrophone: true)
                            )
                            try await room.localParticipant.setCamera(enabled: true)
                        } catch {
                            print("Failed to connect to LiveKit: \(error)")
                        }
                    }
                }
            } else {
                LazyVStack {
                    ForEachParticipant { _ in
                        VStack {
                            ForEachTrack(filter: .video) { trackReference in
                                VideoTrackView(trackReference: trackReference)
                                    .frame(width: 500, height: 500)
                            }
                        }
                    }
                }
            }
        }
        .padding()
        .environmentObject(room)
    }
}
```
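In production, the hardcoded `token` would typically be replaced by a request to your own server. Here is a minimal sketch using URLSession; the endpoint URL, query parameters, and JSON response shape are assumptions about a hypothetical backend, not part of the LiveKit SDK:

```swift
import Foundation

/// Response shape assumed for your token endpoint.
struct TokenResponse: Decodable {
    let token: String
}

/// Fetch a LiveKit access token from your own server.
/// The endpoint path and response format are hypothetical;
/// adapt them to however your backend issues tokens.
func fetchToken(roomName: String, identity: String) async throws -> String {
    var components = URLComponents(string: "https://your-server.example.com/getToken")!
    components.queryItems = [
        URLQueryItem(name: "room", value: roomName),
        URLQueryItem(name: "identity", value: identity),
    ]
    let (data, _) = try await URLSession.shared.data(from: components.url!)
    return try JSONDecoder().decode(TokenResponse.self, from: data).token
}
```

You would then call `fetchToken` inside the `Task` above and pass its result to `room.connect` instead of the hardcoded value.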
For more details, see the components example app.
Next steps
The following resources are useful for getting started with LiveKit on iOS.
Generating tokens
Guide to generating authentication tokens for your users.
Realtime media
Complete documentation for live video and audio tracks.
Realtime data
Send and receive realtime data between clients.
Swift SDK
LiveKit Swift SDK on GitHub.
SwiftUI Components
LiveKit SwiftUI Components on GitHub.
Swift SDK reference
LiveKit Swift SDK reference docs.
SwiftUI components reference
LiveKit SwiftUI components reference docs.