
100ms React Native SDK
Integrate Real Time Audio and Video conferencing, Interactive Live Streaming, and Chat in your apps with 100ms React Native SDK.
With support for HLS and RTMP Live Streaming and Recording, Picture-in-Picture (PiP), one-to-one Video Call Modes, Audio Rooms, Video Player and much more, add immersive real-time communications to your apps.
📖 Read the Complete Documentation here: https://www.100ms.live/docs/react-native/v2/foundation/basics
This SDK ships as the following packages:

- @100mslive/react-native-room-kit
- @100mslive/react-native-hms
- @100mslive/react-native-video-plugin
📱 Example App
📲 Download the Example iOS app here: https://testflight.apple.com/join/v4bSIPad
🤖 Download the Example Android app here: https://appdistribution.firebase.dev/i/7b7ab3b30e627c35
To get a better understanding of how the example app is structured, what to do in the onJoin, onTrack and onPeer listeners, how to create PeerTrackNodes, how to use Redux, and what types of layouts and sorting you can implement in your app, check out the Example App's README.
To run the Example app on your system, follow these steps:

- In the project root, run `npm install`
- Go to the example folder: `cd example`
- In the example folder, run `npm install` again
- To run on Android, run `npx react-native run-android`
- To run on iOS, first install the pods in the ios folder with `cd ios; pod install`. Then, set the correct Development Team in the Xcode Signing & Capabilities section. Finally, from the example folder, run `npx react-native run-ios`
A Troubleshooting Guide for resolving issues when running the Example app is available here.
⚙️ Minimum Configuration
- Support for React Native 0.73.0 or above
- Support for Java 17 or above
- Support for Android API level 24 or above
- Xcode 14 or above
- Support for iOS 16 or above
- Node.js 22 or above
✅ Recommended Configuration
- React Native 0.77.3 or above
- Java 17 or above
- Android API level 35 or above
- Xcode 15 or above
- iOS 16 or above
- Node.js 22 or above
📱 Supported Devices
- Android: The Android SDK supports Android API level 24 (Android 7.0) and higher. It is built for the arm64-v8a and x86_64 architectures (64-bit only). Devices running Android OS 11 or above are recommended.
- iOS: iPhones and iPads with iOS version 16 or above are supported.
Installation
```shell
npm install @100mslive/react-native-hms --save
```
📲 Download the Sample iOS App here: https://testflight.apple.com/join/v4bSIPad
🤖 Download the Sample Android App here: https://appdistribution.firebase.dev/i/7b7ab3b30e627c35
More information about Integrating the SDK is available here.
🔐 Permissions
📱 For iOS Permissions
Add the following entries to your Info.plist file:

```xml
<key>NSCameraUsageDescription</key>
<string>Please allow access to Camera to enable video conferencing</string>
<key>NSMicrophoneUsageDescription</key>
<string>Please allow access to Microphone to enable video conferencing</string>
<key>NSLocalNetworkUsageDescription</key>
<string>Please allow access to network usage to enable video conferencing</string>
```
🤖 For Android Permissions

Add the following permissions to your AndroidManifest.xml file:

```xml
<uses-feature android:name="android.hardware.camera.autofocus"/>
<uses-permission android:name="android.permission.CAMERA"/>
<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE"/>
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS"/>
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
<uses-permission android:name="android.permission.BLUETOOTH" android:maxSdkVersion="30" />
<uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
```
You will also need to request Camera and Record Audio permissions at runtime before you join a call or display a preview. Please follow Android Documentation for runtime permissions.
We suggest using the react-native-permissions library to acquire permissions on both platforms.
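As a sketch of how gating the join flow on permissions might look (the helper and field names below are illustrative, not part of either SDK; the commented lines assume the react-native-permissions `requestMultiple` API):

```typescript
// Sketch: decide whether it is safe to join/preview based on runtime permissions.
// In a real app you'd obtain `statuses` from react-native-permissions, e.g.:
//
//   import { PERMISSIONS, requestMultiple } from 'react-native-permissions';
//   const statuses = await requestMultiple([
//     PERMISSIONS.ANDROID.CAMERA,
//     PERMISSIONS.ANDROID.RECORD_AUDIO,
//   ]);
//
// The pure helper below then checks the result.
type PermissionStatus = 'granted' | 'denied' | 'blocked' | 'unavailable' | 'limited';

// Returns true only when every requested permission was granted.
function allGranted(statuses: Record<string, PermissionStatus>): boolean {
  return Object.values(statuses).every((status) => status === 'granted');
}

const ok = allGranted({
  'android.permission.CAMERA': 'granted',
  'android.permission.RECORD_AUDIO': 'granted',
}); // true -> safe to proceed to join/preview
```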
More information about Audio Video Permission on iOS & Android is available here.
The package exports all its classes along with an HMSSDK class that manages everything.

First, invoke the build method, which returns an instance of HMSSDK. Save this instance to perform all actions related to 100ms.

```js
import { HMSSDK } from '@100mslive/react-native-hms';

const hmsInstance = await HMSSDK.build();
```
Fetch token using room-code method (Recommended)
We can get the authentication token using the room code from the meeting URL.
Let's understand the subdomain and room code from the sample URL http://100ms-rocks.app.100ms.live/meeting/abc-defg-hij:

- Subdomain is `100ms-rocks`
- Room code is `abc-defg-hij`
Now, to get the room code from a meeting URL, we can write our own logic or use the getCode method from here.
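If you'd rather extract the room code yourself, a minimal sketch might look like this (the function name and the code-format check are ours, not part of the SDK):

```typescript
// Sketch: pull the room code (the last path segment) out of a 100ms meeting URL.
function getRoomCodeFromUrl(meetingUrl: string): string | null {
  try {
    const { pathname } = new URL(meetingUrl);
    // "/meeting/abc-defg-hij" -> ["meeting", "abc-defg-hij"]
    const segments = pathname.split('/').filter(Boolean);
    const code = segments[segments.length - 1] ?? '';
    // Room codes look like "abc-defg-hij": three dash-separated groups.
    return /^[a-z0-9]+-[a-z0-9]+-[a-z0-9]+$/.test(code) ? code : null;
  } catch {
    return null; // not a parseable URL
  }
}

getRoomCodeFromUrl('http://100ms-rocks.app.100ms.live/meeting/abc-defg-hij');
// -> 'abc-defg-hij'
```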
To generate the token, we will use the getAuthTokenByRoomCode method available on the HMSSDK instance. This method takes roomCode as a required parameter, and userId & endpoint as optional parameters.

Let's check out the implementation:

```js
const token = await hmsInstance.getAuthTokenByRoomCode('YOUR_ROOM_CODE');

const hmsConfig = new HMSConfig({
  authToken: token,
  username: 'John Appleseed',
});
```
Get temporary token from dashboard
To test audio/video functionality, you need to connect to a 100ms Room. Follow these steps:

- Navigate to your 100ms dashboard, or create an account if you don't have one.
- Use the Video Conferencing Starter Kit to quickly create a room with a default template assigned to it.
- Go to the Rooms page in your dashboard, click on the Room Id of the room you created above, and click on the Join Room button on the top right.
- You will see 100ms demo URLs for the roles created when you deployed the starter kit; click on the 'key' icon to copy the token and update the AUTH_TOKEN variable in the "App.js" file.
The token from the 100ms dashboard is for testing purposes only. For production applications, you must generate tokens on your own server. Refer to the Management Token section in the Authentication and Tokens guide for more information.
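As a rough illustration of what a server-side token service might do, here is a Node.js sketch that signs a client auth token with your app secret. The payload fields (access_key, room_id, user_id, role, type, version) follow the 100ms auth-token format as we understand it; treat every name here as an assumption and verify the exact claims against the Authentication and Tokens guide before shipping.

```typescript
// Server-side sketch (Node.js only, never in the app): mint a 100ms client auth
// token as an HS256 JWT. APP_ACCESS_KEY / APP_SECRET come from your dashboard.
import { createHmac, randomUUID } from 'node:crypto';

const b64url = (input: string): string =>
  Buffer.from(input).toString('base64url');

function createAuthToken(opts: {
  accessKey: string;
  secret: string;
  roomId: string;
  userId: string;
  role: string;
}): string {
  const now = Math.floor(Date.now() / 1000);
  const header = b64url(JSON.stringify({ alg: 'HS256', typ: 'JWT' }));
  const payload = b64url(
    JSON.stringify({
      access_key: opts.accessKey, // assumed claim names; verify against the guide
      room_id: opts.roomId,
      user_id: opts.userId,
      role: opts.role,
      type: 'app',
      version: 2,
      jti: randomUUID(),
      iat: now,
      nbf: now,
      exp: now + 24 * 60 * 60, // 24-hour validity
    })
  );
  const signature = createHmac('sha256', opts.secret)
    .update(`${header}.${payload}`)
    .digest('base64url');
  return `${header}.${payload}.${signature}`;
}
```

Your app would then fetch this token over HTTPS from your server and pass it to HMSConfig instead of a dashboard token.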
Add Event Listeners to get notified about actions happening in the 100ms Room.
The most commonly used Events are onJoin, onPeerUpdate & onTrackUpdate. All the available actions can be found in the HMSUpdateListenerActions class.
Event Listeners are used to handle any update happening in the 100ms Room.
```js
const hmsInstance = await HMSSDK.build();

hmsInstance.addEventListener(
  HMSUpdateListenerActions.ON_JOIN,
  joinSuccess
);
```
The detailed QuickStart Guide is available here.
To interact with peers in an audio or video call, the user needs to Join a Room.

When a user indicates that they want to join a room, your app should have:
- User Name - The name which should be displayed to other peers in the room.
- Authentication Token - The client-side Authentication Token generated by the Token Service. Details about how to create Auth Tokens are available here.
Additionally, you can also pass these fields while Joining a Room:

- Track Settings - Such as joining a Room with muted audio or video, using the HMSTrackSettings object. More information is available here.
- Peer Metadata - This can be used to pass any additional metadata associated with the user, using the metadata field of the HMSConfig object, e.g. a user-id mapping on the application side. More information is available here.
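Since the metadata field is a plain string, structured data is typically JSON-encoded on join and decoded when reading a peer's metadata. A small sketch (the field name appUserId is illustrative, not part of the SDK):

```typescript
// Sketch: encode app-side metadata for HMSConfig, and safely decode it back
// when reading another peer's metadata string.
function encodeMetadata(data: Record<string, unknown>): string {
  return JSON.stringify(data);
}

function decodeMetadata<T = Record<string, unknown>>(metadata?: string): T | null {
  if (!metadata) return null;
  try {
    return JSON.parse(metadata) as T;
  } catch {
    return null; // peers may have empty or non-JSON metadata
  }
}

// Intended use with the SDK (not runnable outside a React Native app):
// const config = new HMSConfig({
//   authToken: token,
//   username: 'John Appleseed',
//   metadata: encodeMetadata({ appUserId: 'user_123' }),
// });
```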
NOTE: The ON_JOIN Event Listener must be attached before calling the join function in order to receive the event callback.
```js
const hmsInstance = await HMSSDK.build();

const token = await hmsInstance.getAuthTokenByRoomCode('abc-defg-hij');

hmsInstance.addEventListener(HMSUpdateListenerActions.ON_JOIN, onJoinSuccess);

hmsInstance.addEventListener(
  HMSUpdateListenerActions.ON_PEER_UPDATE,
  onPeerUpdate
);

hmsInstance.addEventListener(
  HMSUpdateListenerActions.ON_TRACK_UPDATE,
  onTrackUpdate
);

hmsInstance.addEventListener(HMSUpdateListenerActions.ON_ERROR, onError);

let config = new HMSConfig({
  authToken: token,
  username: 'John Appleseed',
});

hmsInstance.join(config);
```
More information about Joining a Room is available here.
Basic Mechanism of using 100ms APIs
To invoke any action, simply use the HMSSDK instance created in the steps above. A few common examples:

```js
// Mute the local peer's audio
hmsInstance?.localPeer?.localAudioTrack()?.setMute(true);

// Mute the local peer's video
hmsInstance?.localPeer?.localVideoTrack()?.setMute(true);

// Switch between front and back camera
hmsInstance?.localPeer?.localVideoTrack()?.switchCamera();

// Leave the Room
await hmsInstance?.leave();

// Send a chat message to everyone in the Room
await hmsInstance?.sendBroadcastMessage('Hello Everyone! 👋');
```
More information about using HMSSDK APIs is available here.
It all comes down to this. All the setup so far has been done so that we can show Live Streaming Video in our beautiful apps.
The 100ms React Native SDK provides the HmsView component that renders video on the screen. You can access HmsView from the HMSSDK instance created in the steps above.

We simply have to pass a video track's trackId to HmsView to begin automatic rendering of the live video stream.

We can also optionally pass props like key, scaleType and mirror to customize the HmsView component.
```js
const HmsView = hmsInstance.HmsView;

<HmsView trackId={videoTrackId} key={videoTrackId} style={styles.hmsView} />;

const styles = StyleSheet.create({
  hmsView: {
    height: '100%',
    width: '100%',
  },
});
```
- One HmsView component can only be connected to one video trackId. To display multiple videos, you have to create multiple instances of the HmsView component.
- It's recommended to always pass the key property while creating HmsView. If a null or undefined trackId is passed to HmsView, you will have to unmount and remount it with the new trackId. Using the key prop and passing the trackId to it achieves this automatically.
- The HmsView component requires width and height in the style prop to set the bounds of the tile that will show the video stream.
- Once an HmsView is no longer required, it should ALWAYS be disposed.
- The recommended practice is to show a maximum of 3 to 4 HmsView tiles on a single page/screen of the app. This avoids overloading the device's network data consumption and video-decoding resources.
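One way to honor the 3-4 tile guidance is to page track ids into fixed-size groups and render only the current page. A minimal sketch (the function name and page size are our choices, not an SDK API):

```typescript
// Sketch: split the full list of video track ids into pages of at most
// `pageSize` entries, so only one page of HmsView tiles is mounted at a time.
function paginateTrackIds(trackIds: string[], pageSize = 4): string[][] {
  const pages: string[][] = [];
  for (let i = 0; i < trackIds.length; i += pageSize) {
    pages.push(trackIds.slice(i, i + pageSize));
  }
  return pages;
}

paginateTrackIds(['a', 'b', 'c', 'd', 'e']);
// -> [['a', 'b', 'c', 'd'], ['e']]
```

In your screen component you would then render HmsView tiles only for `pages[currentPage]`, switching pages as the user swipes.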
More information about Rendering Videos is available here.
Always use the ON_PEER_UPDATE and ON_TRACK_UPDATE listeners; they deliver updated localPeer and remotePeers objects whenever there is an event related to these values.
The following code snippet shows a simple example of attaching a Track Update Event Listener and using it to show videos.

```tsx
const [trackIds, setTrackIds] = useState<string[]>([]);

// Keep a list of video track ids in state as tracks are added and removed
const onTrackListener = (data: { peer: HMSPeer; track: HMSTrack; type: HMSTrackUpdate }) => {
  if (data.track.type !== HMSTrackType.VIDEO) return;

  if (data.type === HMSTrackUpdate.TRACK_ADDED) {
    setTrackIds(prevTrackIds => [...prevTrackIds, data.track.trackId]);
  }

  if (data.type === HMSTrackUpdate.TRACK_REMOVED) {
    setTrackIds(prevTrackIds => prevTrackIds.filter(prevTrackId => prevTrackId !== data.track.trackId));
  }
};

hmsInstance.addEventListener(
  HMSUpdateListenerActions.ON_TRACK_UPDATE,
  onTrackListener
);
```

```tsx
<FlatList
  data={trackIds} // trackIds is an array of trackIds of video tracks
  keyExtractor={(trackId) => trackId}
  renderItem={({ item }) => (
    <HMSView key={item} trackId={item} style={{ width: '100%', height: 300 }} {...} />
  )}
  {...}
/>
```
More information about Rendering Videos is available here.
In the 100ms Example App we have shown how to set up the various listeners, what data to store in Redux and what all features you can implement.
We have also implemented multiple commonly used views. Check out the videos & relevant code in the Example app.
📖 Read the Complete Documentation here: https://www.100ms.live/docs/react-native/v2/foundation/basics