# Amazon Chime SDK for JavaScript

Build video calling, audio calling, and screen sharing applications powered by Amazon Chime.
The Amazon Chime SDK makes it easy to add collaborative audio calling,
video calling, and screen share features to web applications by using
the same infrastructure services that power millions of Amazon Chime
online meetings.
The Amazon Chime SDK for JavaScript works by connecting to meeting session
resources that you have created in your AWS account. The SDK has everything
you need to build custom calling and collaboration experiences in your
web application, including methods to: configure meeting sessions, list and
select audio and video devices, start and stop screen share and screen share
viewing, receive callbacks when media events occur such as volume changes, and
control meeting features such as audio mute and video tile bindings.
To get started, review the resources and guides below.
## Examples
- Meeting Demo - A browser meeting application with a local server
- Serverless Meeting Demo - A self-contained serverless meeting application
- Video Help Desk - A tutorial that shows how to build a website widget that allows a customer to make a video call to a help desk
- Single JS - A script to bundle the SDK into a single `.js` file
- Recording Demo - Recording the meeting's audio, video, and screen share in high definition
- Virtual Classroom - An online classroom built with Electron and React
## Installation
Make sure you have Node.js version 10 or higher.
To add the Amazon Chime SDK for JavaScript into an existing application,
install the package directly from npm:
```
npm install amazon-chime-sdk-js --save
```
## Setup
Create a meeting session in your client application.
```js
import {
  ConsoleLogger,
  DefaultDeviceController,
  DefaultMeetingSession,
  LogLevel,
  MeetingSessionConfiguration
} from 'amazon-chime-sdk-js';

const logger = new ConsoleLogger('MyLogger', LogLevel.INFO);
const deviceController = new DefaultDeviceController(logger);

// You need responses from server-side Chime API. See below for details.
const meetingResponse = /* The response from the CreateMeeting API action */;
const attendeeResponse = /* The response from the CreateAttendee API action */;
const configuration = new MeetingSessionConfiguration(meetingResponse, attendeeResponse);

// In the usage examples below, you will use this meetingSession object.
const meetingSession = new DefaultMeetingSession(
  configuration,
  logger,
  deviceController
);
```
### Getting responses from your server application
You can use an AWS SDK, the AWS Command Line Interface (AWS CLI), or the REST API
to make API calls. In this section, you will use the AWS SDK for JavaScript in your server application, e.g. Node.js.
See Amazon Chime SDK API Reference for more information.
⚠️ The server application does not require the Amazon Chime SDK for JavaScript.
```js
const AWS = require('aws-sdk');
const { v4: uuid } = require('uuid');

// You must use "us-east-1" as the region for Chime API and set the endpoint.
const chime = new AWS.Chime({ region: 'us-east-1' });
chime.endpoint = new AWS.Endpoint('https://service.chime.aws.amazon.com/console');

const meetingResponse = await chime.createMeeting({
  ClientRequestToken: uuid(),
  MediaRegion: 'us-west-2' // Specify the region in which to create the meeting.
}).promise();

const attendeeResponse = await chime.createAttendee({
  MeetingId: meetingResponse.Meeting.MeetingId,
  ExternalUserId: uuid() // Link the attendee to an identity managed by your application.
}).promise();
```
Now securely transfer the `meetingResponse` and `attendeeResponse` objects to your client application.
These objects contain all the information needed for a client application using the Amazon Chime SDK for JavaScript to join the meeting.
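One way to hand both objects to the browser is to wrap them in a single JSON payload returned by your own endpoint. The sketch below shows that shape; the `JoinInfo` field names and the helper functions are illustrative assumptions, not part of the SDK's API:

```js
// Hypothetical server-side helper: bundle both API responses into one payload.
function makeJoinInfo(meetingResponse, attendeeResponse) {
  return {
    JoinInfo: {
      Meeting: meetingResponse.Meeting,
      Attendee: attendeeResponse.Attendee
    }
  };
}

// Hypothetical client-side helper: recover the two objects from the payload.
// MeetingSessionConfiguration accepts objects containing Meeting and Attendee.
function splitJoinInfo(payload) {
  return {
    meetingResponse: { Meeting: payload.JoinInfo.Meeting },
    attendeeResponse: { Attendee: payload.JoinInfo.Attendee }
  };
}
```

However you transport the payload, serve it only over HTTPS and only to authenticated users, since it grants the ability to join the meeting.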
## Building and testing
```
npm run build
npm run test
```
After running `npm run test` the first time, you can use `npm run test:fast` to speed up the test suite.

To view code coverage results, open `build/coverage/index.html` in your browser after running `npm run test`.
## Generating the documentation
To generate JavaScript API reference documentation, run:

```
npm run build
npm run doc
```

Then open `docs/index.html` in your browser.
## Reporting a suspected vulnerability
If you discover a potential security issue in this project, we ask that you notify AWS/Amazon Security via our
vulnerability reporting page. Please do not create a public GitHub issue.
## Usage
### Device
Note: Before starting a session, you need to choose your microphone, speaker, and camera.
Use case 1. List audio input, audio output, and video input devices. The browser will ask for microphone and camera permissions.
```js
const audioInputDevices = await meetingSession.audioVideo.listAudioInputDevices();
const audioOutputDevices = await meetingSession.audioVideo.listAudioOutputDevices();
const videoInputDevices = await meetingSession.audioVideo.listVideoInputDevices();

// An array of MediaDeviceInfo objects
audioInputDevices.forEach(mediaDeviceInfo => {
  console.log(`Device ID: ${mediaDeviceInfo.deviceId} Microphone: ${mediaDeviceInfo.label}`);
});
```
Use case 2. Choose audio input and audio output devices by passing the `deviceId` of a `MediaDeviceInfo` object.
```js
const audioInputDeviceInfo = /* A MediaDeviceInfo object from listAudioInputDevices */;
await meetingSession.audioVideo.chooseAudioInputDevice(audioInputDeviceInfo.deviceId);

const audioOutputDeviceInfo = /* A MediaDeviceInfo object from listAudioOutputDevices */;
await meetingSession.audioVideo.chooseAudioOutputDevice(audioOutputDeviceInfo.deviceId);
```
Use case 3. Choose a video input device by passing the `deviceId` of a `MediaDeviceInfo` object.
If there is an LED light next to the attendee's camera, it will be turned on indicating that it is now capturing from the camera.
You probably want to choose a video input device when you start sharing your video.
```js
const videoInputDeviceInfo = /* A MediaDeviceInfo object from listVideoInputDevices */;
await meetingSession.audioVideo.chooseVideoInputDevice(videoInputDeviceInfo.deviceId);

// You can pass null to choose none. If the previously chosen camera has an LED light on,
// it will turn off, indicating that the camera is no longer capturing.
await meetingSession.audioVideo.chooseVideoInputDevice(null);
```
Use case 4. Add a device change observer to receive the updated device list.
For example, when you pair Bluetooth headsets with your computer, `audioInputsChanged` and `audioOutputsChanged` are called
with the device list including the headsets.
```js
const observer = {
  audioInputsChanged: freshAudioInputDeviceList => {
    // An array of MediaDeviceInfo objects
    freshAudioInputDeviceList.forEach(mediaDeviceInfo => {
      console.log(`Device ID: ${mediaDeviceInfo.deviceId} Microphone: ${mediaDeviceInfo.label}`);
    });
  },
  audioOutputsChanged: freshAudioOutputDeviceList => {
    console.log('Audio outputs updated: ', freshAudioOutputDeviceList);
  },
  videoInputsChanged: freshVideoInputDeviceList => {
    console.log('Video inputs updated: ', freshVideoInputDeviceList);
  }
};

meetingSession.audioVideo.addDeviceChangeObserver(observer);
```
### Starting a session
Use case 5. Start a session. To hear audio, you need to bind a device and stream to an `<audio>` element.
Once the session has started, you can talk to and listen to attendees.
Make sure you have chosen your microphone and speaker (see the "Device" section), and that at least one other attendee has joined the session.
```js
const audioElement = /* HTMLAudioElement object e.g. document.getElementById('audio-element-id') */;
meetingSession.audioVideo.bindAudioElement(audioElement);

const observer = {
  audioVideoDidStart: () => {
    console.log('Started');
  }
};

meetingSession.audioVideo.addObserver(observer);

meetingSession.audioVideo.start();
```
Use case 6. Add an observer to receive session lifecycle events: connecting, start, and stop.

Note: You can remove an observer by calling `meetingSession.audioVideo.removeObserver(observer)`.
In a component-based architecture (such as React, Vue, and Angular), you may need to add an observer
when a component is mounted, and remove it when unmounted.
```js
const observer = {
  audioVideoDidStart: () => {
    console.log('Started');
  },
  audioVideoDidStop: sessionStatus => {
    // See the "Stopping a session" section for details.
    console.log('Stopped with a session status code: ', sessionStatus.statusCode());
  },
  audioVideoDidStartConnecting: reconnecting => {
    if (reconnecting) {
      // e.g. the WiFi connection is dropped.
      console.log('Attempting to reconnect');
    }
  }
};

meetingSession.audioVideo.addObserver(observer);
```
### Audio
Note: So far, you've added observers to receive device and session lifecycle events.
In the following use cases, you'll use the real-time API methods to send and receive volume indicators and control mute state.
Use case 7. Mute and unmute an audio input.
```js
// Mute
meetingSession.audioVideo.realtimeMuteLocalAudio();

// Unmute
const unmuted = meetingSession.audioVideo.realtimeUnmuteLocalAudio();
if (unmuted) {
  console.log('Other attendees can hear your audio');
} else {
  // See the realtimeSetCanUnmuteLocalAudio use case below.
  console.log('You cannot unmute yourself');
}
```
Use case 8. To check whether the local microphone is muted, use this method rather than keeping track of your own mute state.
```js
const muted = meetingSession.audioVideo.realtimeIsLocalAudioMuted();
if (muted) {
  console.log('You are muted');
} else {
  console.log('Other attendees can hear your audio');
}
```
Use case 9. Disable unmute. If you want to prevent users from unmuting themselves (for example during a presentation), use these methods rather than keeping track of your own can-unmute state.
```js
meetingSession.audioVideo.realtimeSetCanUnmuteLocalAudio(false);

// Optional: Force mute.
meetingSession.audioVideo.realtimeMuteLocalAudio();

const unmuted = meetingSession.audioVideo.realtimeUnmuteLocalAudio();
console.log(`${unmuted} is false. You cannot unmute yourself`);
```
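The interaction between the can-unmute setting and unmute attempts can be modeled as a tiny state machine. The class below is only an illustration of the semantics described above, not the SDK's implementation:

```js
// Minimal model of the mute / can-unmute semantics. Illustrative only;
// in real code, call the realtime* methods on meetingSession.audioVideo.
class MuteState {
  constructor() {
    this.muted = false;
    this.canUnmute = true;
  }
  setCanUnmute(allowed) {
    // Mirrors realtimeSetCanUnmuteLocalAudio.
    this.canUnmute = allowed;
  }
  mute() {
    // Muting is always allowed.
    this.muted = true;
  }
  unmute() {
    // Mirrors realtimeUnmuteLocalAudio: returns false when unmuting is disallowed.
    if (!this.canUnmute) {
      return false;
    }
    this.muted = false;
    return true;
  }
}
```

Using the real-time methods (rather than tracking this state yourself) keeps your UI consistent with what other attendees actually hear.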
Use case 10. Subscribe to volume changes of a specific attendee. You can use this to build a real-time volume indicator UI.
```js
import { DefaultModality } from 'amazon-chime-sdk-js';

const presentAttendeeId = meetingSession.configuration.credentials.attendeeId;

meetingSession.audioVideo.realtimeSubscribeToVolumeIndicator(
  presentAttendeeId,
  (attendeeId, volume, muted, signalStrength) => {
    const baseAttendeeId = new DefaultModality(attendeeId).base();
    if (baseAttendeeId !== attendeeId) {
      // See the "Screen and content share" section for details.
      console.log(`The volume of ${baseAttendeeId}'s content changes`);
    }
    console.log(`${attendeeId}'s volume data: `, {
      volume, // a fraction between 0 and 1
      muted, // a boolean
      signalStrength // 0 (no signal), 0.5 (weak), 1 (strong)
    });
  }
);
```
Use case 11. Detect the most active speaker. For example, you can enlarge the active speaker's video element if available.
```js
import { DefaultActiveSpeakerPolicy } from 'amazon-chime-sdk-js';

const activeSpeakerCallback = attendeeIds => {
  if (attendeeIds.length) {
    console.log(`${attendeeIds[0]} is the most active speaker`);
  }
};

meetingSession.audioVideo.subscribeToActiveSpeakerDetector(
  new DefaultActiveSpeakerPolicy(),
  activeSpeakerCallback
);
```
### Video
Note: In Chime SDK terms, a video tile is an object containing an attendee ID,
a video stream, etc. To view a video in your application, you must bind a tile to a `<video>` element.

- Make sure you bind a tile to the same video element until the tile is removed.
- A tile is created with a new tile ID when the same attendee restarts the video.
Use case 12. Start sharing your video. The local video element is flipped horizontally (mirrored mode).
```js
const videoElement = /* HTMLVideoElement object e.g. document.getElementById('video-element-id') */;

// Make sure you have chosen your camera. In this use case, you will choose the first device.
const videoInputDevices = await meetingSession.audioVideo.listVideoInputDevices();
await meetingSession.audioVideo.chooseVideoInputDevice(videoInputDevices[0].deviceId);

const observer = {
  // videoTileDidUpdate is called whenever a new tile is created or tileState changes.
  videoTileDidUpdate: tileState => {
    // Bind only your own (local) tile; ignore a tile without an attendee ID.
    if (!tileState.boundAttendeeId || !tileState.localTile) {
      return;
    }
    meetingSession.audioVideo.bindVideoElement(tileState.tileId, videoElement);
  }
};

meetingSession.audioVideo.addObserver(observer);

meetingSession.audioVideo.startLocalVideoTile();
```
Use case 13. Stop sharing your video.
```js
const videoElement = /* HTMLVideoElement object e.g. document.getElementById('video-element-id') */;

let localTileId = null;
const observer = {
  videoTileDidUpdate: tileState => {
    // Bind only your own (local) tile; ignore a tile without an attendee ID.
    if (!tileState.boundAttendeeId || !tileState.localTile) {
      return;
    }
    console.log(`If you called stopLocalVideoTile, ${tileState.active} is false.`);
    meetingSession.audioVideo.bindVideoElement(tileState.tileId, videoElement);
    localTileId = tileState.tileId;
  },
  videoTileWasRemoved: tileId => {
    if (localTileId === tileId) {
      console.log(`You called removeLocalVideoTile. videoElement can be bound to another tile.`);
      localTileId = null;
    }
  }
};

meetingSession.audioVideo.addObserver(observer);

// Stop sharing your video. The tile stays but becomes inactive.
meetingSession.audioVideo.stopLocalVideoTile();

// Optional: You can remove the local tile from the session.
meetingSession.audioVideo.removeLocalVideoTile();
```
Use case 14. View one attendee video, e.g. in a 1-on-1 session.
```js
const videoElement = /* HTMLVideoElement object e.g. document.getElementById('video-element-id') */;

const observer = {
  videoTileDidUpdate: tileState => {
    // Ignore a tile without an attendee ID, your local tile, and content shares.
    if (!tileState.boundAttendeeId || tileState.localTile || tileState.isContent) {
      return;
    }
    meetingSession.audioVideo.bindVideoElement(tileState.tileId, videoElement);
  }
};

meetingSession.audioVideo.addObserver(observer);
```
Use case 15. View up to 16 attendee videos. Assume that you have 16 video elements in your application,
and that an empty cell in the index map means the corresponding element is available.
```js
const videoElements = [/* an array of 16 HTMLVideoElement objects in your application */];

// index-tileId pairs
const indexMap = {};

const acquireVideoElement = tileId => {
  // Return the same video element if the tile is already bound.
  for (let i = 0; i < 16; i += 1) {
    if (indexMap[i] === tileId) {
      return videoElements[i];
    }
  }
  // Return the next available video element.
  for (let i = 0; i < 16; i += 1) {
    if (!indexMap.hasOwnProperty(i)) {
      indexMap[i] = tileId;
      return videoElements[i];
    }
  }
  throw new Error('no video element is available');
};

const releaseVideoElement = tileId => {
  for (let i = 0; i < 16; i += 1) {
    if (indexMap[i] === tileId) {
      delete indexMap[i];
      return;
    }
  }
};

const observer = {
  videoTileDidUpdate: tileState => {
    // Ignore a tile without an attendee ID, your local tile, and content shares.
    if (!tileState.boundAttendeeId || tileState.localTile || tileState.isContent) {
      return;
    }
    meetingSession.audioVideo.bindVideoElement(
      tileState.tileId,
      acquireVideoElement(tileState.tileId)
    );
  },
  videoTileWasRemoved: tileId => {
    releaseVideoElement(tileId);
  }
};

meetingSession.audioVideo.addObserver(observer);
```
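The acquire/release bookkeeping above is plain JavaScript, so you can verify it in isolation before wiring it to tiles. This sketch repeats the same index-map pattern with a pool of four placeholder strings standing in for video elements:

```js
// Same index-map pattern as above, with a pool of 4 placeholder "elements".
const pool = ['el-0', 'el-1', 'el-2', 'el-3'];
const slots = {};

function acquire(tileId) {
  // Return the same element if the tile is already bound.
  for (let i = 0; i < pool.length; i += 1) {
    if (slots[i] === tileId) {
      return pool[i];
    }
  }
  // Otherwise claim the next free slot.
  for (let i = 0; i < pool.length; i += 1) {
    if (!Object.prototype.hasOwnProperty.call(slots, i)) {
      slots[i] = tileId;
      return pool[i];
    }
  }
  throw new Error('no element is available');
}

function release(tileId) {
  for (let i = 0; i < pool.length; i += 1) {
    if (slots[i] === tileId) {
      delete slots[i];
      return;
    }
  }
}
```

Acquiring is idempotent per tile ID, and releasing a tile frees its slot for the next tile that appears.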
### Screen and content share
Note: When you or other attendees share content (a screen capture, a video file, or any other MediaStream object),
the content attendee (`attendee-id#content`) joins the session and shares content as if a regular attendee shares a video.

For example, your attendee ID is "my-id". When you call `meetingSession.audioVideo.startContentShare`,
the content attendee "my-id#content" will join the session and share your content.
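The content attendee ID is simply the base attendee ID plus a `#content` suffix, and `DefaultModality` parses it for you. Conceptually it behaves like the sketch below (a simplified illustration; use `DefaultModality` from the SDK in real code):

```js
// Conceptual sketch of DefaultModality's base()/modality() behavior.
// For illustration only, assuming '#' as the separator.
const SEPARATOR = '#';

function base(attendeeId) {
  // "my-id#content" -> "my-id"; "my-id" -> "my-id"
  return attendeeId.split(SEPARATOR)[0];
}

function modality(attendeeId) {
  // "my-id#content" -> "content"; "my-id" -> ""
  const parts = attendeeId.split(SEPARATOR);
  return parts.length > 1 ? parts[1] : '';
}
```

Comparing `base(attendeeId)` against your own attendee ID is how the use cases below distinguish your content share from other attendees'.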
Use case 16. Start sharing your screen.
```js
import { DefaultModality } from 'amazon-chime-sdk-js';

const observer = {
  videoTileDidUpdate: tileState => {
    // Ignore a tile without an attendee ID and non-content tiles.
    if (!tileState.boundAttendeeId || !tileState.isContent) {
      return;
    }
    const yourAttendeeId = meetingSession.configuration.credentials.attendeeId;

    // tileState.boundAttendeeId is formatted as "attendee-id#content".
    const boundAttendeeId = tileState.boundAttendeeId;

    // Get the attendee ID from "attendee-id#content".
    const baseAttendeeId = new DefaultModality(boundAttendeeId).base();
    if (baseAttendeeId === yourAttendeeId) {
      console.log('You called startContentShareFromScreenCapture');
    }
  },
  contentShareDidStart: () => {
    console.log('Screen share started');
  },
  contentShareDidStop: () => {
    console.log('Screen share stopped');
  }
};

meetingSession.audioVideo.addContentShareObserver(observer);
meetingSession.audioVideo.addObserver(observer);

await meetingSession.audioVideo.startContentShareFromScreenCapture();
```
Use case 17. Start sharing your screen in an environment that does not support a screen picker dialog, e.g. Electron.
```js
const sourceId = /* Window or screen ID e.g. the ID of a DesktopCapturerSource object in Electron */;

await meetingSession.audioVideo.startContentShareFromScreenCapture(sourceId);
```
Use case 18. Start streaming your video file from an `<input>` element of type `file`.
```js
const videoElement = /* HTMLVideoElement object e.g. document.getElementById('video-element-id') */;
const inputElement = /* HTMLInputElement object e.g. document.getElementById('input-element-id') */;

inputElement.addEventListener('change', async () => {
  const file = inputElement.files[0];
  const url = URL.createObjectURL(file);
  videoElement.src = url;
  await videoElement.play();

  const mediaStream = videoElement.captureStream();
  await meetingSession.audioVideo.startContentShare(mediaStream);
  inputElement.value = '';
});
```
Use case 19. Stop sharing your screen or content.
```js
const observer = {
  contentShareDidStop: () => {
    console.log('Content share stopped');
  }
};

meetingSession.audioVideo.addContentShareObserver(observer);

await meetingSession.audioVideo.stopContentShare();
```
Use case 20. View up to 2 attendee content shares or screens. The Chime SDK allows 2 simultaneous content shares per meeting.
```js
import { DefaultModality } from 'amazon-chime-sdk-js';

const videoElementStack = [/* an array of 2 HTMLVideoElement objects in your application */];

// tileId-videoElement map
const tileMap = {};

const observer = {
  videoTileDidUpdate: tileState => {
    // Ignore a tile without an attendee ID and non-content tiles.
    if (!tileState.boundAttendeeId || !tileState.isContent) {
      return;
    }
    const yourAttendeeId = meetingSession.configuration.credentials.attendeeId;

    // Get the attendee ID from "attendee-id#content".
    const boundAttendeeId = tileState.boundAttendeeId;
    const baseAttendeeId = new DefaultModality(boundAttendeeId).base();
    if (baseAttendeeId !== yourAttendeeId) {
      console.log(`${baseAttendeeId} is sharing screen now`);

      // Get the already bound video element or a new one from the stack.
      const videoElement = tileMap[tileState.tileId] || videoElementStack.pop();
      if (videoElement) {
        tileMap[tileState.tileId] = videoElement;
        meetingSession.audioVideo.bindVideoElement(tileState.tileId, videoElement);
      } else {
        console.log('No video element is available');
      }
    }
  },
  videoTileWasRemoved: tileId => {
    // Return the unused video element to the stack.
    const videoElement = tileMap[tileId];
    if (videoElement) {
      videoElementStack.push(videoElement);
      delete tileMap[tileId];
    }
  }
};

meetingSession.audioVideo.addObserver(observer);
```
### Attendees
Use case 21. Subscribe to attendee presence changes. When an attendee joins or leaves a session,
the callback receives `presentAttendeeId` and `present` (a boolean).
```js
const attendeePresenceSet = new Set();
const callback = (presentAttendeeId, present) => {
  console.log(`Attendee ID: ${presentAttendeeId} Present: ${present}`);
  if (present) {
    attendeePresenceSet.add(presentAttendeeId);
  } else {
    attendeePresenceSet.delete(presentAttendeeId);
  }
};

meetingSession.audioVideo.realtimeSubscribeToAttendeeIdPresence(callback);
```
Use case 22. Create a simple roster by subscribing to attendee presence and volume changes.
```js
import { DefaultModality } from 'amazon-chime-sdk-js';

const roster = {};

meetingSession.audioVideo.realtimeSubscribeToAttendeeIdPresence(
  (presentAttendeeId, present) => {
    if (!present) {
      delete roster[presentAttendeeId];
      return;
    }

    meetingSession.audioVideo.realtimeSubscribeToVolumeIndicator(
      presentAttendeeId,
      (attendeeId, volume, muted, signalStrength) => {
        const baseAttendeeId = new DefaultModality(attendeeId).base();
        if (baseAttendeeId !== attendeeId) {
          // Optional: Do not include the content attendee (attendee-id#content) in the roster.
          return;
        }

        if (roster.hasOwnProperty(attendeeId)) {
          // A null value for any field means that it has not changed.
          roster[attendeeId].volume = volume; // a fraction between 0 and 1
          roster[attendeeId].muted = muted; // a boolean
          roster[attendeeId].signalStrength = signalStrength; // 0 (no signal), 0.5 (weak), 1 (strong)
        } else {
          // Add an attendee.
          roster[attendeeId] = {
            attendeeId,
            volume,
            muted,
            signalStrength
          };
        }
      }
    );
  }
);
```
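The roster bookkeeping above can be factored into pure update functions, which makes it easy to unit-test or to plug into a UI framework's state management. A sketch, using the same field names as the volume-indicator callback (the helper names are illustrative, not part of the SDK):

```js
// Pure roster helpers mirroring the subscription callbacks above.
// Each call returns a new roster object instead of mutating in place.
function upsertAttendee(roster, attendeeId, volume, muted, signalStrength) {
  return {
    ...roster,
    [attendeeId]: { attendeeId, volume, muted, signalStrength }
  };
}

function removeAttendee(roster, attendeeId) {
  // Drop the attendee's entry, keeping the rest of the roster.
  const { [attendeeId]: removed, ...rest } = roster;
  return rest;
}
```

Returning new objects (rather than mutating `roster`) fits frameworks like React, where state changes are detected by reference.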
### Monitoring and alerts
Use case 23. Add an observer to receive video metrics. See `AudioVideoObserver` for more available metrics,
such as WebRTC statistics processed by the Chime SDK.
```js
const observer = {
  videoSendHealthDidChange: (bitrateKbps, packetsPerSecond) => {
    console.log(`Sending bitrate in kilobits per second: ${bitrateKbps} and packets per second: ${packetsPerSecond}`);
  },
  videoSendBandwidthDidChange: (newBandwidthKbps, oldBandwidthKbps) => {
    console.log(`Sending bandwidth changed from ${oldBandwidthKbps} to ${newBandwidthKbps}`);
  },
  videoReceiveBandwidthDidChange: (newBandwidthKbps, oldBandwidthKbps) => {
    console.log(`Receiving bandwidth changed from ${oldBandwidthKbps} to ${newBandwidthKbps}`);
  }
};

meetingSession.audioVideo.addObserver(observer);
```
Use case 24. Add an observer to receive alerts. You can use these alerts to notify users of connection problems.
```js
const observer = {
  connectionDidBecomePoor: () => {
    console.log('Your connection is poor');
  },
  connectionDidSuggestStopVideo: () => {
    console.log('Recommend turning off your video');
  },
  videoSendDidBecomeUnavailable: () => {
    // Chime SDK allows a total of 16 simultaneous videos per meeting.
    console.log('You cannot share your video');
  },
  videoAvailabilityDidChange: videoAvailability => {
    if (videoAvailability.canStartLocalVideo) {
      console.log('You can share your video');
    } else {
      console.log('You cannot share your video');
    }
  }
};

meetingSession.audioVideo.addObserver(observer);
```
### Stopping a session
Use case 25. Leave a session.
```js
import { MeetingSessionStatusCode } from 'amazon-chime-sdk-js';

const observer = {
  audioVideoDidStop: sessionStatus => {
    const sessionStatusCode = sessionStatus.statusCode();
    if (sessionStatusCode === MeetingSessionStatusCode.Left) {
      /*
        You called meetingSession.audioVideo.stop().
      */
      console.log('You left the session');
    } else {
      console.log('Stopped with a session status code: ', sessionStatusCode);
    }
  }
};

meetingSession.audioVideo.addObserver(observer);

meetingSession.audioVideo.stop();
```
Use case 26. Add an observer to get notified when a session has ended.
```js
import { MeetingSessionStatusCode } from 'amazon-chime-sdk-js';

const observer = {
  audioVideoDidStop: sessionStatus => {
    const sessionStatusCode = sessionStatus.statusCode();
    if (sessionStatusCode === MeetingSessionStatusCode.AudioCallEnded) {
      /*
        The session has ended, e.g. the meeting was deleted.
      */
      console.log('The session has ended');
    } else {
      console.log('Stopped with a session status code: ', sessionStatusCode);
    }
  }
};

meetingSession.audioVideo.addObserver(observer);
```
Copyright 2019-2020 Amazon.com, Inc. or its affiliates. All Rights Reserved.