Expo quickstart

Get started with LiveKit and Expo on React Native

1. Install LiveKit SDK

LiveKit provides a React Native SDK and corresponding Expo config plugin. Install the packages and dependencies with:

npm install @livekit/react-native @livekit/react-native-expo-plugin @livekit/react-native-webrtc @config-plugins/react-native-webrtc
Note

The LiveKit SDK is not compatible with the Expo Go app because it requires native code. Using expo-dev-client and building locally lets you create development builds that are compatible with LiveKit, as shown below.
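
For example, assuming you are using the Expo CLI, you can add expo-dev-client and build a development client locally with commands along these lines (the exact workflow depends on your project and target platforms):

npx expo install expo-dev-client

# Build and run a development build locally
npx expo run:android
# or
npx expo run:ios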

In your root folder's app.json, add the Expo plugins like so:

{
  "expo": {
    "plugins": ["@livekit/react-native-expo-plugin", "@config-plugins/react-native-webrtc"]
  }
}

Finally, in your App.js file, set up the LiveKit SDK by calling registerGlobals(). This sets up the required WebRTC libraries for use in JavaScript and is needed for LiveKit to work.

import { registerGlobals } from '@livekit/react-native';
registerGlobals();

2. Connect to a room, publish video & audio

import * as React from 'react';
import {
  StyleSheet,
  View,
  FlatList,
  ListRenderItem,
} from 'react-native';
import { useEffect } from 'react';
import {
  AudioSession,
  LiveKitRoom,
  useTracks,
  TrackReferenceOrPlaceholder,
  VideoTrack,
  isTrackReference,
  registerGlobals,
} from '@livekit/react-native';
import { Track } from 'livekit-client';

registerGlobals();

// !! Note !!
// This sample hardcodes a token which expires in 2 hours.
const wsURL = "<your LiveKit server URL>";
const token = "<generate a token>";

export default function App() {
  // Start the audio session first.
  useEffect(() => {
    let start = async () => {
      await AudioSession.startAudioSession();
    };
    start();
    return () => {
      AudioSession.stopAudioSession();
    };
  }, []);

  return (
    <LiveKitRoom
      serverUrl={wsURL}
      token={token}
      connect={true}
      options={{
        // Use screen pixel density to handle screens with differing densities.
        adaptiveStream: { pixelDensity: 'screen' },
      }}
      audio={true}
      video={true}
    >
      <RoomView />
    </LiveKitRoom>
  );
}

const RoomView = () => {
  // Get all camera tracks.
  const tracks = useTracks([Track.Source.Camera]);

  const renderTrack: ListRenderItem<TrackReferenceOrPlaceholder> = ({ item }) => {
    // Render using the VideoTrack component.
    if (isTrackReference(item)) {
      return <VideoTrack trackRef={item} style={styles.participantView} />;
    } else {
      return <View style={styles.participantView} />;
    }
  };

  return (
    <View style={styles.container}>
      <FlatList
        data={tracks}
        renderItem={renderTrack}
      />
    </View>
  );
};

const styles = StyleSheet.create({
  container: {
    flex: 1,
    alignItems: 'stretch',
    justifyContent: 'center',
  },
  participantView: {
    height: 300,
  },
});

See the quickstart example repo for a fully configured app using Expo.
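
In a real app, rather than hardcoding wsURL and token, you would typically fetch them from your own token server at runtime (see the next step). A minimal sketch, assuming a hypothetical /getToken endpoint that returns the server URL and a token:

// Hypothetical helper; the endpoint path, query parameters, and response shape
// are assumptions. Match them to whatever your backend exposes.
type ConnectionDetails = { serverUrl: string; token: string };

async function fetchConnectionDetails(roomName: string, identity: string): Promise<ConnectionDetails> {
  const response = await fetch(
    `https://your-token-server.example.com/getToken?roomName=${encodeURIComponent(roomName)}&identity=${encodeURIComponent(identity)}`
  );
  if (!response.ok) {
    throw new Error(`Token request failed: ${response.status}`);
  }
  return (await response.json()) as ConnectionDetails;
}

You can store the result in component state and pass it to the serverUrl and token props of LiveKitRoom.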

3. Create a backend server to generate tokens

Set up a server to generate tokens for your app at runtime by following this guide: Generating Tokens.
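
As a rough illustration (not a substitute for the guide above), a minimal Node token endpoint might look like the sketch below, built with Express and the livekit-server-sdk package. The route name, port, and environment variable names are assumptions; adapt them to your own backend.

import express from 'express';
import { AccessToken } from 'livekit-server-sdk';

const app = express();

// Hypothetical endpoint: GET /getToken?roomName=...&identity=...
app.get('/getToken', async (req, res) => {
  const roomName = String(req.query.roomName ?? 'my-room');
  const identity = String(req.query.identity ?? 'anonymous');

  // Your API key and secret come from your LiveKit project; keep them server-side only.
  const at = new AccessToken(process.env.LIVEKIT_API_KEY, process.env.LIVEKIT_API_SECRET, {
    identity,
    ttl: 2 * 60 * 60, // token lifetime in seconds (2 hours)
  });
  at.addGrant({ roomJoin: true, room: roomName });

  res.json({
    serverUrl: process.env.LIVEKIT_URL,
    token: await at.toJwt(),
  });
});

app.listen(3000, () => console.log('Token server listening on port 3000'));

Generate a token per participant, scope it to the room that participant should join, and return it to the client over HTTPS; never embed your API secret in the mobile app.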