livekit-react-native
Use this SDK to add real-time video, audio and data features to your React Native app. By connecting to a self- or cloud-hosted LiveKit server, you can quickly build applications like interactive live streaming or video calls with just a few lines of code.
> [!NOTE]
> This is v2 of the React Native SDK. When migrating from v1.x to v2.x you may encounter a small set of breaking changes.
> Read the migration guide for a detailed overview of what has changed.
In your MainApplication.java:

```java
import com.livekit.reactnative.LiveKitReactNative;
import com.livekit.reactnative.audio.AudioType;

public class MainApplication extends Application implements ReactApplication {
  @Override
  public void onCreate() {
    // Place this above any other RN related initialization.
    // When AudioType is omitted, it defaults to CommunicationAudioType.
    // Use MediaAudioType if the user is only consuming audio, not publishing.
    LiveKitReactNative.setup(this, new AudioType.CommunicationAudioType());
    //...
  }
}
```
Or in your MainApplication.kt if you are using RN 0.73+:

```kotlin
import com.livekit.reactnative.LiveKitReactNative
import com.livekit.reactnative.audio.AudioType

class MainApplication : Application(), ReactApplication {
  override fun onCreate() {
    // Place this above any other RN related initialization.
    // When AudioType is omitted, it defaults to CommunicationAudioType.
    // Use MediaAudioType if the user is only consuming audio, not publishing.
    LiveKitReactNative.setup(this, AudioType.CommunicationAudioType())
    //...
  }
}
```
For iOS, in your AppDelegate.m:

```objc
#import "LivekitReactNative.h"

@implementation AppDelegate

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
  // Place this above any other RN related initialization
  [LivekitReactNative setup];
  //...
}

@end
```
We've included an example app that you can try out.
Usage
In your index.js file, set up the LiveKit SDK by calling registerGlobals().
This registers the required WebRTC libraries for use in JavaScript, and is needed for LiveKit to work.
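Based on the description above, a minimal index.js might look like the following sketch (the `@livekit/react-native` import path and the surrounding AppRegistry boilerplate are assumptions from a standard React Native entry point):

```javascript
// index.js
import { registerGlobals } from '@livekit/react-native';
import { AppRegistry } from 'react-native';
import App from './App';
import { name as appName } from './app.json';

// Registers the WebRTC globals that LiveKit relies on.
// Call this before any LiveKit code executes.
registerGlobals();

AppRegistry.registerComponent(appName, () => App);
```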
As seen in the above example, we've introduced a class AudioSession that helps
to manage the audio session on native platforms. This class wraps AudioManager on Android and AVAudioSession on iOS.
You can customize the configuration of the audio session with configureAudio.
Android
Media playback
By default, the audio session is set up for bidirectional communication. In this mode, the audio framework exhibits the following behaviors:

- The volume cannot be reduced to 0.
- Echo cancellation is available and is enabled by default.
- A microphone indicator can be displayed, depending on the platform.
If you're leveraging LiveKit primarily for media playback, you have the option to reconfigure the audio session to better suit media playback. Here's how:
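A sketch of that reconfiguration, assuming the SDK exposes an `AndroidAudioTypePresets` export with a `media` preset (verify the name against your installed version):

```javascript
import { AudioSession, AndroidAudioTypePresets } from '@livekit/react-native';

// Switch the Android audio session to a media-playback profile
// instead of the default bidirectional communication profile.
await AudioSession.configureAudio({
  android: {
    audioTypeOptions: AndroidAudioTypePresets.media,
  },
});
await AudioSession.startAudioSession();
```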
Instead of using our presets, you can further customize the audio session to suit your specific needs.
```javascript
await AudioSession.configureAudio({
  android: {
    preferredOutputList: ['earpiece'],
    // See [AudioManager](https://developer.android.com/reference/android/media/AudioManager)
    // for details on audio and focus modes.
    audioTypeOptions: {
      manageAudioFocus: true,
      audioMode: 'normal',
      audioFocusMode: 'gain',
      audioStreamType: 'music',
      audioAttributesUsageType: 'media',
      audioAttributesContentType: 'unknown',
    },
  },
});
await AudioSession.startAudioSession();
```
iOS
For iOS, the most appropriate audio configuration may change over time as local and remote
audio tracks publish and unpublish from the room. To adapt to this, the useIOSAudioManagement
hook is recommended over configuring the audio session just once for the entire call.
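A hedged sketch of wiring up the hook inside a component; `RoomView` is a hypothetical component name, and the exact hook signature may vary by SDK version:

```javascript
import * as React from 'react';
import { useIOSAudioManagement } from '@livekit/react-native';

// `room` is assumed to be a connected Room instance passed in by the caller.
function RoomView({ room }) {
  // Reconfigures AVAudioSession as tracks publish and unpublish in the room.
  useIOSAudioManagement(room);
  return null; // render your participants here
}
```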
Screenshare
Enabling screenshare requires extra installation steps:
Android
Android screenshare requires a foreground service with type mediaProjection to be present.
iOS
Enabling iOS screenshare involves copying the files found in this sample project
to your iOS project, and registering a Broadcast Extension in Xcode.
It's also recommended to use CallKeep to register a call with CallKit
(as well as turning on the VoIP background mode).
Due to background app processing limitations, screen recording may be interrupted if the app is restricted
in the background. Registering with CallKit allows the app to continue processing for the duration of the call.
Once set up, iOS screenshare can be initiated like so:
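The original snippet is truncated here. As a hedged sketch, screen capture is typically toggled through the local participant; the `setScreenShareEnabled` name is assumed from the LiveKit client API, so verify it against your installed version:

```javascript
// Enables screenshare; on iOS this triggers the Broadcast Extension flow
// registered in Xcode above.
await room.localParticipant.setScreenShareEnabled(true);
```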