@q42/lib-jitsi-meet - npm Package Compare versions

Comparing version 2.0.4289 to 2.0.5176

doc/API.md

Jitsi Meet API
==============

```html
<script src="https://meet.jit.si/libs/lib-jitsi-meet.min.js"></script>
```
Now you can access the Jitsi Meet API through the `JitsiMeetJS` global object.

Components
----------
You can access the following methods and objects through the `JitsiMeetJS` object.
* `JitsiMeetJS.init(options)` - this method initializes the Jitsi Meet API.
The `options` parameter is a JS object with the following properties:
- `useIPv6` - boolean property
- `desktopSharingChromeExtId` - The ID of the jidesha extension for Chrome. Example: 'mbocklcggfhnbahlnepmldehdhpjfcjp'
- `desktopSharingChromeDisabled` - Boolean. Whether desktop sharing should be disabled on Chrome. Example: false.
- `desktopSharingChromeSources` - Array of strings with the media sources to use when using screen sharing with the Chrome extension. Example: ['screen', 'window']
- `desktopSharingChromeMinExtVersion` - Required version of Chrome extension. Example: '0.1'
- `desktopSharingFirefoxDisabled` - Boolean. Whether desktop sharing should be disabled on Firefox. Example: false.
- `disableAudioLevels` - boolean property. Enables/disables audio levels.

- `disableSimulcast` - boolean property. Enables/disables simulcast.

- `enableAnalyticsLogging` - boolean property (default false). Enables/disables analytics logging.
- `externalStorage` - Object that implements the Storage interface. If specified this object will be used for storing data instead of `localStorage`.
- `callStatsCustomScriptUrl` - (optional) custom url to access callstats client script
- `callStatsConfIDNamespace` - (optional) a namespace to prepend the callstats conference ID with. Defaults to the window.location.hostname
- `disableRtx` - (optional) boolean property (defaults to false). Enables/disables the use of RTX.
- `disabledCodec` - the mime type of the codec that should not be negotiated on the peerconnection.
- `preferredCodec` - the mime type of the codec that needs to be made the preferred codec for the connection.
- `disableH264` - __DEPRECATED__. Use `disabledCodec` instead.
- `preferH264` - __DEPRECATED__. Use `preferredCodec` instead.
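Putting the options above together, a minimal sketch of an `init` call could look like this. The option values are purely illustrative (the codec mime types in particular are assumptions; check what your deployment negotiates), and the `typeof` guard just keeps the snippet inert outside a page that has loaded the library:

```javascript
// Illustrative init configuration -- values are examples, not recommendations.
const initOptions = {
    useIPv6: false,
    disableAudioLevels: false,
    disableSimulcast: false,
    enableAnalyticsLogging: false,
    // Hypothetical codec mime types -- verify against your deployment.
    disabledCodec: 'video/H264',
    preferredCodec: 'video/VP8'
};

// Only call init when the library has actually been loaded.
if (typeof JitsiMeetJS !== 'undefined') {
    JitsiMeetJS.init(initOptions);
}
```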
* `JitsiMeetJS.JitsiConnection` - the `JitsiConnection` constructor. You can use it to create a new server connection.
* `JitsiMeetJS.setLogLevel` - changes the log level for the library. For example, to have only error messages:
```javascript
JitsiMeetJS.setLogLevel(JitsiMeetJS.logLevels.ERROR);
```
* `JitsiMeetJS.createLocalTracks(options, firePermissionPromptIsShownEvent)` - Creates the media tracks and returns them through a `Promise` object. If rejected, passes a `JitsiTrackError` instance to the catch block.
- `options` - JS object with configuration options for the local media tracks. You can change the following properties there:
1. `devices` - array with the devices - "desktop", "video" and "audio" that will be passed to GUM. If that property is not set GUM will try to get all available devices.
2. `resolution` - the preferred resolution for the local video.
3. `constraints` - the preferred encoding properties for the created track (replaces 'resolution' in newer releases of browsers)
4. `cameraDeviceId` - the deviceID for the video device that is going to be used
5. `micDeviceId` - the deviceID for the audio device that is going to be used
6. `minFps` - the minimum frame rate for the video stream (passed to GUM)
7. `maxFps` - the maximum frame rate for the video stream (passed to GUM)
8. `desktopSharingFrameRate`
- `min` - Minimum fps
- `max` - Maximum fps
9. `desktopSharingSourceDevice` - The device id or label for a video input source that should be used for screensharing.
10. `facingMode` - facing mode for a camera (possible values - 'user', 'environment')
- `firePermissionPromptIsShownEvent` - optional boolean parameter. If set to `true`, `JitsiMediaDevicesEvents.PERMISSION_PROMPT_IS_SHOWN` will be fired when the browser shows the gUM permission prompt.
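A sketch of requesting local audio and video with these options, assuming the library is loaded and `JitsiMeetJS.init()` has already run (the guard keeps the snippet harmless elsewhere):

```javascript
// Illustrative track options; `resolution` is superseded by
// `constraints` in newer browsers, as noted above.
const trackOptions = {
    devices: ['audio', 'video'],
    resolution: 720,
    facingMode: 'user'
};

if (typeof JitsiMeetJS !== 'undefined') {
    JitsiMeetJS.createLocalTracks(trackOptions, true)
        .then(tracks => {
            // Each entry is a JitsiTrack; attach them to the UI here.
            tracks.forEach(track => console.log('created', track.getType()));
        })
        .catch(error => {
            // On rejection, a JitsiTrackError instance arrives here.
            console.error('createLocalTracks failed:', error);
        });
}
```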
* `JitsiMeetJS.createTrackVADEmitter(localAudioDeviceId, sampleRate, vadProcessor)` - Creates a TrackVADEmitter service that connects an audio track to a VAD (voice activity detection) processor in order to obtain VAD scores for individual PCM audio samples.
- `localAudioDeviceId` - The target local audio device.
- `sampleRate` - Sample rate at which the emitter will operate. Possible values: 256, 512, 1024, 4096, 8192, 16384. Passing other values will default to the closest neighbor; i.e. providing a value of 4096 means that the emitter will process bundles of 4096 PCM samples at a time. Higher values mean longer but fewer calls; lower values mean more but shorter calls.
- `vadProcessor` - VAD processor that does the actual computation on a PCM sample. The processor needs to implement the following functions:
- `getSampleLength()` - Returns the sample size accepted by calculateAudioFrameVAD.
- `getRequiredPCMFrequency()` - Returns the PCM frequency at which the processor operates, e.g. 16 kHz, 44.1 kHz, etc.
- `calculateAudioFrameVAD(pcmSample)` - Processes a Float32 PCM sample of getSampleLength size.
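A hypothetical processor satisfying that interface. The "VAD score" here is just the mean absolute amplitude, a stand-in for a real voice-activity model, and `'default'` is a placeholder device id:

```javascript
// Hypothetical VAD processor: not a real model, just a sketch of the
// interface createTrackVADEmitter expects.
const naiveVadProcessor = {
    // Sample size accepted by calculateAudioFrameVAD.
    getSampleLength() { return 4096; },
    // PCM frequency the processor operates at (16 kHz here).
    getRequiredPCMFrequency() { return 16000; },
    // Fake score: mean absolute amplitude of the Float32 PCM sample.
    calculateAudioFrameVAD(pcmSample) {
        let sum = 0;
        for (const v of pcmSample) sum += Math.abs(v);
        return pcmSample.length ? Math.min(1, sum / pcmSample.length) : 0;
    }
};

if (typeof JitsiMeetJS !== 'undefined') {
    JitsiMeetJS.createTrackVADEmitter('default', 4096, naiveVadProcessor);
}
```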
* `JitsiMeetJS.enumerateDevices(callback)` - __DEPRECATED__. Use `JitsiMeetJS.mediaDevices.enumerateDevices(callback)` instead.
* `JitsiMeetJS.isDeviceChangeAvailable(deviceType)` - __DEPRECATED__. Use `JitsiMeetJS.mediaDevices.isDeviceChangeAvailable(deviceType)` instead.
* `JitsiMeetJS.isDesktopSharingEnabled()` - returns true if desktop sharing is supported and false otherwise. NOTE: this method can only be used after `JitsiMeetJS.init(options)` has completed; otherwise the result will always be null.
* `JitsiMeetJS.getActiveAudioDevice()` - goes through all audio devices on the system and returns information about one that is active, i.e. has audio signal. Returns a Promise resolving to an Object with the following structure:
- `deviceId` - string containing the device ID of the audio track found as active.
- `deviceLabel` - string containing the label of the audio device.
* `JitsiMeetJS.getGlobalOnErrorHandler()` - returns a function that can be attached to `window.onerror`; if `options.enableWindowOnErrorHandler` is enabled, returns the handler used by the library (`function(message, source, lineno, colno, error)`).
* `JitsiMeetJS.mediaDevices` - JS object that contains methods for interacting with media devices. The following methods are available:
- `isDeviceListAvailable()` - returns true if retrieving the device list is supported and false otherwise
- `isDeviceChangeAvailable(deviceType)` - returns true if changing the input (camera / microphone) or output (audio) device is supported and false if not. `deviceType` is the type of device to change. Undefined or 'input' stands for input devices, 'output' for audio output devices.
- `enumerateDevices(callback)` - returns list of the available devices as a parameter to the callback function. Every device is a MediaDeviceInfo object with the following properties:
- `label` - the name of the device
- `kind` - "audioinput", "videoinput" or "audiooutput"
- `deviceId` - the id of the device
- `groupId` - group identifier, two devices have the same group identifier if they belong to the same physical device; for example a monitor with both a built-in camera and microphone
- `setAudioOutputDevice(deviceId)` - sets current audio output device. `deviceId` - id of 'audiooutput' device from `JitsiMeetJS.enumerateDevices()`, '' is for default device.
- `getAudioOutputDevice()` - returns currently used audio output device id, '' stands for default device.
- `isDevicePermissionGranted(type)` - returns a Promise which resolves to true if the user granted permission to media devices. `type` - 'audio', 'video' or `undefined`. If `undefined`, checks whether both audio and video permissions were granted.
- `addEventListener(event, handler)` - attaches an event handler.
- `removeEventListener(event, handler)` - removes an event handler.
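As a sketch of how these methods combine, the snippet below enumerates devices and routes audio to the first output device found. `firstDeviceOfKind` is a hypothetical helper, not part of the API:

```javascript
// Hypothetical helper: pick the first device of a given kind from an
// enumerateDevices() result.
function firstDeviceOfKind(devices, kind) {
    return devices.find(d => d.kind === kind) || null;
}

if (typeof JitsiMeetJS !== 'undefined') {
    JitsiMeetJS.mediaDevices.enumerateDevices(devices => {
        const speaker = firstDeviceOfKind(devices, 'audiooutput');
        if (speaker) {
            // Passing '' instead would select the default device.
            JitsiMeetJS.mediaDevices.setAudioOutputDevice(speaker.deviceId);
        }
    });
}
```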
* `JitsiMeetJS.events` - JS object that contains all events used by the API. You will need this object when you subscribe to connection or conference events.
We have two event types - connection and conference. You can access the events with the following code: `JitsiMeetJS.events.<event_type>.<event_name>`.
For example, if you want to use the conference event that is fired when somebody leaves the conference, you can use `JitsiMeetJS.events.conference.USER_LEFT`.
We support the following events:
1. `conference`
- `TRACK_ADDED` - stream received. (parameters - JitsiTrack)
- `TRACK_REMOVED` - stream removed. (parameters - JitsiTrack)
- `TRACK_MUTE_CHANGED` - JitsiTrack was muted or unmuted. (parameters - JitsiTrack)
- `TRACK_AUDIO_LEVEL_CHANGED` - audio level of JitsiTrack has changed. (parameters - participantId(string), audioLevel(number))
- `DOMINANT_SPEAKER_CHANGED` - the dominant speaker is changed. (parameters - id(string))
- `USER_JOINED` - new user joined a conference. (parameters - id(string), user(JitsiParticipant))
- `USER_LEFT` - a participant left conference. (parameters - id(string), user(JitsiParticipant))
- `MESSAGE_RECEIVED` - new text message received. (parameters - id(string), text(string), ts(number))
- `DISPLAY_NAME_CHANGED` - user has changed his display name. (parameters - id(string), displayName(string))
- `SUBJECT_CHANGED` - notifies that subject of the conference has changed (parameters - subject(string))
- `LAST_N_ENDPOINTS_CHANGED` - last n set was changed (parameters - leavingEndpointIds(array) ids of users leaving lastN, enteringEndpointIds(array) ids of users entering lastN)
- `CONFERENCE_JOINED` - notifies the local user that he joined the conference successfully. (no parameters)
- `CONFERENCE_LEFT` - notifies the local user that he left the conference successfully. (no parameters)
- `DTMF_SUPPORT_CHANGED` - notifies if at least one user supports DTMF. (parameters - supports(boolean))
- `USER_ROLE_CHANGED` - notifies that role of some user changed. (parameters - id(string), role(string))
- `USER_STATUS_CHANGED` - notifies that status of some user changed. (parameters - id(string), status(string))
- `CONFERENCE_FAILED` - notifies that user failed to join the conference. (parameters - errorCode(JitsiMeetJS.errors.conference))
- `CONFERENCE_ERROR` - notifies that error occurred. (parameters - errorCode(JitsiMeetJS.errors.conference))
- `KICKED` - notifies that user has been kicked from the conference.
- `START_MUTED_POLICY_CHANGED` - notifies that all new participants will join with muted audio/video stream (parameters - JS object with 2 properties - audio(boolean), video(boolean))
- `STARTED_MUTED` - notifies that the local user has started muted
- `CONNECTION_STATS` - __DEPRECATED__. Use `JitsiMeetJS.connectionQuality.LOCAL_STATS_UPDATED` instead.
- `BEFORE_STATISTICS_DISPOSED` - fired just before the statistics module is disposed and it's the last chance to submit some logs to the statistics service, before it gets disconnected
- `AUTH_STATUS_CHANGED` - notifies that authentication is enabled or disabled, or local user authenticated (logged in). (parameters - isAuthEnabled(boolean), authIdentity(string))
- `ENDPOINT_MESSAGE_RECEIVED` - notifies that a new message
from another participant is received on a data channel.
- `TALK_WHILE_MUTED` - notifies that a local user is talking while having the microphone muted.
- `NO_AUDIO_INPUT` - notifies that the currently selected input device has no signal.
- `AUDIO_INPUT_STATE_CHANGE` - notifies that the current conference audio input switched between audio input states i.e. with or without audio input.
- `NOISY_MIC` - notifies that the current microphone used by the conference is noisy.
- `PARTICIPANT_PROPERTY_CHANGED` - notifies that user has changed his custom participant property. (parameters - user(JitsiParticipant), propertyKey(string), oldPropertyValue(string), propertyValue(string))
2. `connection`
- `CONNECTION_FAILED` - indicates that the server connection failed.
- `CONNECTION_ESTABLISHED` - indicates that we have successfully established server connection.
- `CONNECTION_DISCONNECTED` - indicates that we are disconnected.
- `WRONG_STATE` - indicates that the user has performed an action that can't be executed because the connection is in the wrong state.
3. `detection`
    - `VAD_SCORE_PUBLISHED` - event generated by a TrackVADEmitter when it computes a VAD score for an audio PCM sample.
4. `track`
- `LOCAL_TRACK_STOPPED` - indicates that a local track was stopped. This
event can be fired when `dispose()` method is called or for other reasons.
- `TRACK_AUDIO_OUTPUT_CHANGED` - indicates that audio output device for track was changed (parameters - deviceId (string) - new audio output device ID).
5. `mediaDevices`
- `DEVICE_LIST_CHANGED` - indicates that list of currently connected devices has changed (parameters - devices(MediaDeviceInfo[])).
- `PERMISSION_PROMPT_IS_SHOWN` - Indicates that the environment is currently showing a permission prompt to access camera and/or microphone (parameters - environmentType ('chrome'|'opera'|'firefox'|'safari'|'nwjs'|'react-native'|'android')).
6. `connectionQuality`
- `LOCAL_STATS_UPDATED` - New local connection statistics are received. (parameters - stats(object))
- `REMOTE_STATS_UPDATED` - New remote connection statistics are received. (parameters - id(string), stats(object))
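As a sketch of the subscription pattern, the helper below wires up a few of the conference events listed above. `conference` is assumed to be a JitsiConference instance created elsewhere, and `wireConferenceEvents` is a hypothetical name:

```javascript
// Sketch: subscribe to a few conference events. `conference` is assumed
// to be a JitsiConference instance; `events` is JitsiMeetJS.events.
function wireConferenceEvents(conference, events) {
    conference.on(events.conference.TRACK_ADDED,
        track => console.log('track added:', track));
    conference.on(events.conference.USER_LEFT,
        id => console.log('user left:', id));
    conference.on(events.conference.DOMINANT_SPEAKER_CHANGED,
        id => console.log('dominant speaker:', id));
}
```

In a real application you would call `wireConferenceEvents(conference, JitsiMeetJS.events)` before joining the conference.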
* `JitsiMeetJS.errors` - JS object that contains all errors used by the API. You can use this object to check the errors reported by the API.
We have three error types - connection, conference and track. You can access the errors with the following code: `JitsiMeetJS.errors.<error_type>.<error_name>`.
For example, if you want to handle the conference error that is fired when a password is required, you can use `JitsiMeetJS.errors.conference.PASSWORD_REQUIRED`.
We support the following errors:
1. `conference`
- `CONNECTION_ERROR` - the connection with the conference is lost.
- `SETUP_FAILED` - conference setup failed
- `AUTHENTICATION_REQUIRED` - user must be authenticated to create this conference
- `PASSWORD_REQUIRED` - this error can be passed when the connection to the conference fails. You should try to join the conference with a password.
- `PASSWORD_NOT_SUPPORTED` - indicates that conference cannot be locked
- `VIDEOBRIDGE_NOT_AVAILABLE` - video bridge issues.
- `RESERVATION_ERROR` - error in reservation system
- `GRACEFUL_SHUTDOWN` - graceful shutdown
- `JINGLE_FATAL_ERROR` - error in jingle (the original error is attached as a parameter)
- `CONFERENCE_DESTROYED` - conference has been destroyed
- `CHAT_ERROR` - chat error happened
- `FOCUS_DISCONNECTED` - focus error happened
- `FOCUS_LEFT` - focus left the conference
- `CONFERENCE_MAX_USERS` - The maximum users limit has been reached
2. `connection`
- `CONNECTION_DROPPED_ERROR` - indicates that the connection was dropped with an error which was most likely caused by some networking issues.
- `PASSWORD_REQUIRED` - passed when the connection to the server failed. You should try to authenticate with a password.
- `SERVER_ERROR` - indicates too many 5XX errors were received from the server.
- `OTHER_ERROR` - all other errors
3. `track`
- `GENERAL` - generic getUserMedia-related error.
- `UNSUPPORTED_RESOLUTION` - getUserMedia-related error, indicates that requested video resolution is not supported by camera.
- `PERMISSION_DENIED` - getUserMedia-related error, indicates that user denied permission to share requested device.
- `NOT_FOUND` - getUserMedia-related error, indicates that requested device was not found.
- `CONSTRAINT_FAILED` - getUserMedia-related error, indicates that some of requested constraints in getUserMedia call were not satisfied.
- `TRACK_IS_DISPOSED` - an error which indicates that the track has already been disposed and can no longer be used.
- `TRACK_NO_STREAM_FOUND` - an error which indicates that the track has no associated MediaStream.
- `SCREENSHARING_GENERIC_ERROR` - generic error for screensharing.
- `SCREENSHARING_USER_CANCELED` - an error which indicates that user canceled screen sharing window selection dialog.
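A sketch of dispatching on those conference error codes, for example from a `CONFERENCE_FAILED` handler. The function name, the return values, and the `askForPassword` callback are all illustrative:

```javascript
// Hypothetical dispatcher for conference errors. `errors` is expected to
// be JitsiMeetJS.errors; `askForPassword` is an app-supplied callback.
function handleConferenceFailed(errorCode, errors, askForPassword) {
    if (errorCode === errors.conference.PASSWORD_REQUIRED) {
        askForPassword(); // e.g. show a password prompt, then rejoin
        return 'password-required';
    }
    if (errorCode === errors.conference.CONFERENCE_MAX_USERS) {
        return 'conference-full';
    }
    return 'failed';
}
```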
* `JitsiMeetJS.errorTypes` - constructors for Error instances that can be produced by the library. They are useful for checks like `error instanceof JitsiMeetJS.errorTypes.JitsiTrackError`. The following Errors are available:
1. `JitsiTrackError` - Error that happened to a JitsiTrack.
* `JitsiMeetJS.logLevels` - object with the log levels:
1. `TRACE`
2. `DEBUG`
3. `INFO`
4. `LOG`
5. `WARN`
6. `ERROR`
JitsiConnection
------------
This object represents the server connection. You can create a new `JitsiConnection` object with the constructor `JitsiMeetJS.JitsiConnection`. `JitsiConnection` has the following methods:
1. ```JitsiConnection(appID, token, options)``` - constructor. Creates the conference object.
1. `JitsiConnection(appID, token, options)` - constructor. Creates the connection object.
- appID - identification for the provider of Jitsi Meet video conferencing services. **NOTE: not implemented yet. You can safely pass ```null```**
- token - secret generated by the provider of Jitsi Meet video conferencing services. The token will be send to the provider from the Jitsi Meet server deployment for authorization of the current client.
- options - JS object with configuration options for the server connection. You can change the following properties there:
1. serviceUrl - XMPP service URL. For example 'wss://server.com/xmpp-websocket' for Websocket or '//server.com/http-bind' for BOSH.
2. bosh - DEPRECATED, use serviceUrl to specify either BOSH or Websocket URL.
3. hosts - JS Object
- domain
- muc
- anonymousdomain
4. useStunTurn -
5. enableLipSync - (optional) boolean property which enables the lipsync feature. Currently works only in Chrome and is enabled by default.
- `appID` - identification for the provider of Jitsi Meet video conferencing services. **NOTE: not implemented yet. You can safely pass `null`**
    - `token` - secret generated by the provider of Jitsi Meet video conferencing services. The token will be sent to the provider from the Jitsi Meet server deployment for authorization of the current client.
- `options` - JS object with configuration options for the server connection. You can change the following properties there:
1. `serviceUrl` - XMPP service URL. For example 'wss://server.com/xmpp-websocket' for Websocket or '//server.com/http-bind' for BOSH.
2. `bosh` - DEPRECATED, use serviceUrl to specify either BOSH or Websocket URL.
3. `hosts` - JS Object
- `domain`
- `muc`
- `anonymousdomain`
4. `enableLipSync` - (optional) boolean property which enables the lipsync feature. Currently works only in Chrome and is disabled by default.
5. `clientNode` - The name of client node advertised in XEP-0115 'c' stanza
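For illustration, a minimal options object for the constructor might look like this (the host names are placeholders, not a real deployment):

```javascript
// Placeholder deployment values -- adjust to your own server.
const connectionOptions = {
    // WebSocket URL; use '//example.com/http-bind' for BOSH instead.
    serviceUrl: 'wss://example.com/xmpp-websocket',
    hosts: {
        domain: 'example.com',
        muc: 'conference.example.com'
    }
};

// appID is not implemented yet, so null is safe; token may be null
// on deployments that do not use JWT authentication.
function createConnection(jitsi, token = null) {
    return new jitsi.JitsiConnection(null, token, connectionOptions);
}
```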
2. connect(options) - establish server connection
- options - JS Object with ```id``` and ```password``` properties.
2. `connect(options)` - establish server connection
- `options` - JS Object with `id` and `password` properties.
3. disconnect() - destroys the server connection
3. `disconnect()` - destroys the server connection
4. initJitsiConference(name, options) - creates new ```JitsiConference``` object.
- name - the name of the conference
- options - JS object with configuration options for the conference. You can change the following properties there:
- openBridgeChannel - Enables/disables bridge channel. Values can be "datachannel", "websocket", true (treat it as "datachannel"), undefined (treat it as "datachannel") and false (don't open any channel). **NOTE: we recommend to set that option to true**
- recordingType - the type of recording to be used
- callStatsID - callstats credentials
- callStatsSecret - callstats credentials
- enableTalkWhileMuted - boolean property. Enables/disables talk while muted detection, by default the value is false/disabled.
- ignoreStartMuted - ignores start muted events coming from jicofo.
- startSilent - enables silent mode, will mark audio as inactive will not send/receive audio
- confID - Used for statistics to identify conference, if tenants are supported will contain tenant and the non lower case variant for the room name.
- statisticsId - The id to be used as stats instead of default callStatsUsername.
- statisticsDisplayName - The display name to be used for stats, used for callstats.
4. `initJitsiConference(name, options)` - creates new `JitsiConference` object.
- `name` - the name of the conference
- `options` - JS object with configuration options for the conference. You can change the following properties there:
        - `openBridgeChannel` - Enables/disables bridge channel. Values can be "datachannel", "websocket", true (treat it as "datachannel"), undefined (treat it as "datachannel") and false (don't open any channel). **NOTE: we recommend setting this option to true**
- `recordingType` - the type of recording to be used
- `callStatsID` - callstats credentials
- `callStatsSecret` - callstats credentials
- `enableTalkWhileMuted` - boolean property. Enables/disables talk while muted detection, by default the value is false/disabled.
- `ignoreStartMuted` - ignores start muted events coming from jicofo.
- `startSilent` - enables silent mode, will mark audio as inactive will not send/receive audio
        - `confID` - Used for statistics to identify the conference; if tenants are supported it will contain the tenant and the non-lowercase variant of the room name.
        - `siteID` - (optional) Used for statistics to identify the site where the user is coming from; if tenants are supported it will contain a unique identifier for that tenant. If not provided, the value will be inferred from confID
- `statisticsId` - The id to be used as stats instead of default callStatsUsername.
- `statisticsDisplayName` - The display name to be used for stats, used for callstats.
- `focusUserJid` - The real JID of focus participant - can be overridden here
- `enableNoAudioDetection`
- `enableNoisyMicDetection`
- `enableRemb`
- `enableTcc`
- `useRoomAsSharedDocumentName`
- `channelLastN`
- `startBitrate`
- `stereo`
- `forceJVB121Ratio` - "Math.random() < forceJVB121Ratio" will determine whether a 2 people conference should be moved to the JVB instead of P2P. The decision is made on the responder side, after ICE succeeds on the P2P connection.
- `hiddenDomain`
- `startAudioMuted`
- `startVideoMuted`
- `enableLayerSuspension` - if set to 'true', we will cap the video send bitrate when we are told we have not been selected by any endpoints (and therefore the non-thumbnail streams are not in use).
- `deploymentInfo`
- `shard`
- `userRegion`
- `p2p` - Peer to peer related options
- `enabled` - enables or disable peer-to-peer connection, if disabled all media will be routed through the Jitsi Videobridge.
- `stunServers` - list of STUN servers e.g. `{ urls: 'stun:meet-jit-si-turnrelay.jitsi.net:443' }`
- `backToP2PDelay` - a delay given in seconds, before the conference switches back to P2P, after the 3rd participant has left the room.
            - `disabledCodec` - the mime type of the codec that should not be negotiated on the peerconnection.
            - `preferredCodec` - the mime type of the codec that needs to be made the preferred codec for the connection.
- `disableH264` - __DEPRECATED__. Use `disabledCodec` instead.
- `preferH264` - __DEPRECATED__. Use `preferredCodec` instead.
- `rttMonitor`
- `enabled`
- `initialDelay`
- `getStatsInterval`
- `analyticsInterval`
- `stunServers`
- `e2eping`
- `pingInterval`
- `abTesting` - A/B testing related options
- `enableSuspendVideoTest`
- `testing`
- `capScreenshareBitrate`
- `p2pTestMode`
- `octo`
- `probability`
**NOTE: if 4 and 5 are set the library is going to send events to callstats. Otherwise the callstats integration will be disabled.**
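As an example, a small conference options object using a few of the properties above might look like this (the values are illustrative only, apart from the `openBridgeChannel` recommendation noted above):

```javascript
// Illustrative values only.
const confOptions = {
    openBridgeChannel: true, // recommended above
    p2p: {
        enabled: true,
        backToP2PDelay: 5    // seconds before switching back to P2P
    }
};

// Create the conference from an established connection.
function createConference(connection, name) {
    return connection.initJitsiConference(name, confOptions);
}
```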
5. addEventListener(event, listener) - Subscribes the passed listener to the event.
- event - one of the events from ```JitsiMeetJS.events.connection``` object.
- listener - handler for the event.
5. `addEventListener(event, listener)` - Subscribes the passed listener to the event.
- `event` - one of the events from `JitsiMeetJS.events.connection` object.
- `listener` - handler for the event.
6. removeEventListener(event, listener) - Removes event listener.
- event - the event
- listener - the listener that will be removed.
6. `removeEventListener(event, listener)` - Removes event listener.
- `event` - the event
- `listener` - the listener that will be removed.
7. addFeature - Adds new feature to the list of supported features for the local participant
- feature - string, the name of the feature
- submit - boolean, default false, if true - the new list of features will be immediately submitted to the others.
7. `addFeature` - Adds new feature to the list of supported features for the local participant
- `feature` - string, the name of the feature
- `submit` - boolean, default false, if true - the new list of features will be immediately submitted to the others.
8. removeFeature - Removes a feature from the list of supported features for the local participant
- feature - string, the name of the feature
- submit - boolean, default false, if true - the new list of features will be immediately submitted to the others.
8. `removeFeature` - Removes a feature from the list of supported features for the local participant
- `feature` - string, the name of the feature
- `submit` - boolean, default false, if true - the new list of features will be immediately submitted to the others.
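Putting `connect` and the listener methods together, a connection can be wired up as follows (a sketch; the handler names are placeholders):

```javascript
// Sketch: subscribe to the connection lifecycle events, then connect.
function connectWithListeners(jitsi, connection, handlers) {
    const events = jitsi.events.connection;

    connection.addEventListener(events.CONNECTION_ESTABLISHED, handlers.onEstablished);
    connection.addEventListener(events.CONNECTION_FAILED, handlers.onFailed);
    connection.addEventListener(events.CONNECTION_DISCONNECTED, handlers.onDisconnected);
    connection.connect({});
}
```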

@@ -276,30 +317,30 @@ JitsiConference

1. join(password) - Joins the conference
1. `join(password)` - Joins the conference
- password - string of the password. This parameter is not mandatory.
2. leave() - leaves the conference. Returns Promise.
2. `leave()` - leaves the conference. Returns Promise.
3. myUserId() - get local user ID.
3. `myUserId()` - get local user ID.
4. getLocalTracks() - Returns array with JitsiTrack objects for the local streams.
4. `getLocalTracks()` - Returns array with JitsiTrack objects for the local streams.
5. addEventListener(event, listener) - Subscribes the passed listener to the event.
- event - one of the events from ```JitsiMeetJS.events.conference``` object.
- listener - handler for the event.
5. `addEventListener(event, listener)` - Subscribes the passed listener to the event.
- `event` - one of the events from `JitsiMeetJS.events.conference` object.
- `listener` - handler for the event.
6. removeEventListener(event, listener) - Removes event listener.
- event - the event
- listener - the listener that will be removed.
6. `removeEventListener(event, listener)` - Removes event listener.
- `event` - the event
- `listener` - the listener that will be removed.
7. on(event, listener) - alias for addEventListener
7. `on(event, listener)` - alias for addEventListener
8. off(event, listener) - alias for removeEventListener
8. `off(event, listener)` - alias for removeEventListener
9. sendTextMessage(text) - sends the given string to other participants in the conference.
9. `sendTextMessage(text)` - sends the given string to other participants in the conference.
10. setDisplayName(name) - changes the display name of the local participant.
- name - the new display name
10. `setDisplayName(name)` - changes the display name of the local participant.
- `name` - the new display name
11. selectParticipant(participantId) - Elects the participant with the given id to be the selected participant in order to receive higher video quality (if simulcast is enabled).
- participantId - the identifier of the participant
11. `selectParticipant(participantId)` - Elects the participant with the given id to be the selected participant in order to receive higher video quality (if simulcast is enabled).
- `participantId` - the identifier of the participant
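As a tiny combined example of the calls above (assumes an already joined `room`):

```javascript
// Sketch: set a display name, then greet the other participants.
function introduce(room, name) {
    room.setDisplayName(name);
    room.sendTextMessage(`Hello from ${name}!`);
}
```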

@@ -309,8 +350,8 @@ Throws NetworkError or InvalidStateError or Error if the operation fails.

12. sendCommand(name, values) - sends user defined system command to the other participants
- name - the name of the command.
- values - JS object. The object has the following structure:
12. `sendCommand(name, values)` - sends a user-defined system command to the other participants
- `name` - the name of the command.
- `values` - JS object. The object has the following structure:
```
```javascript
{

@@ -322,3 +363,3 @@

attributes: {},// map with keys the name of the attribute and values - the values of the attributes.
attributes: {}, // map with keys the name of the attribute and values - the values of the attributes.

@@ -334,85 +375,91 @@

13. sendCommandOnce(name, values) - Sends only one time a user defined system command to the other participants
13. `sendCommandOnce(name, values)` - sends a user-defined system command to the other participants only once
14. removeCommand(name) - removes a command for the list of the commands that are sent to the ther participants
- name - the name of the command
14. `removeCommand(name)` - removes a command from the list of commands that are sent to the other participants
- `name` - the name of the command
15. addCommandListener(command, handler) - adds listener
- command - string for the name of the command
- handler(values) - the listener that will be called when a command is received from another participant.
15. `addCommandListener(command, handler)` - adds listener
- `command` - string for the name of the command
- `handler(values)` - the listener that will be called when a command is received from another participant.
16. removeCommandListener(command) - removes the listeners for the specified command
- command - the name of the command
16. `removeCommandListener(command)` - removes the listeners for the specified command
- `command` - the name of the command
17. addTrack(track) - Adds JitsiLocalTrack object to the conference. Throws an error if adding second video stream. Returns Promise.
- track - the JitsiLocalTrack
17. `addTrack(track)` - Adds a `JitsiLocalTrack` object to the conference. Throws an error if adding a second video stream. Returns Promise.
- `track` - the `JitsiLocalTrack`
18. removeTrack(track) - Removes JitsiLocalTrack object to the conference. Returns Promise.
- track - the JitsiLocalTrack
18. `removeTrack(track)` - Removes `JitsiLocalTrack` object to the conference. Returns Promise.
- `track` - the `JitsiLocalTrack`
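`addTrack` is typically combined with `JitsiMeetJS.createLocalTracks`, the library's entry point for creating local tracks (a sketch; error handling is omitted):

```javascript
// Sketch: create local audio/video tracks and publish them to the room.
// `jitsi` is the JitsiMeetJS global; createLocalTracks returns a Promise.
function publishLocalTracks(jitsi, room) {
    return jitsi.createLocalTracks({ devices: [ 'audio', 'video' ] })
        .then(tracks => Promise.all(tracks.map(track => room.addTrack(track))));
}
```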
19. isDTMFSupported() - Check if at least one user supports DTMF.
19. `isDTMFSupported()` - Check if at least one user supports DTMF.
20. getRole() - returns string with the local user role ("moderator" or "none")
20. `getRole()` - returns string with the local user role ("moderator" or "none")
21. isModerator() - checks if local user has "moderator" role
21. `isModerator()` - checks if local user has "moderator" role
22. lock(password) - set password for the conference; returns Promise
- password - string password
22. `lock(password)` - set password for the conference; returns Promise
- `password` - string password
Note: available only for moderator
23. unlock() - unset conference password; returns Promise
23. `unlock()` - unset conference password; returns Promise
Note: available only for moderator
24. kick(id) - Kick participant from the conference
- id - string participant id
24. `kickParticipant(id)` - Kick participant from the conference
- `id` - string participant id
25. setStartMutedPolicy(policy) - make all new participants join with muted audio/video
- policy - JS object with following properties
- audio - boolean if audio stream should be muted
- video - boolean if video stream should be muted
25. `setStartMutedPolicy(policy)` - make all new participants join with muted audio/video
- `policy` - JS object with following properties
- `audio` - boolean if audio stream should be muted
- `video` - boolean if video stream should be muted
Note: available only for moderator
26. getStartMutedPolicy() - returns the current policy with JS object:
- policy - JS object with following properties
- audio - boolean if audio stream should be muted
- video - boolean if video stream should be muted
26. `getStartMutedPolicy()` - returns the current policy with JS object:
- `policy` - JS object with following properties
- `audio` - boolean if audio stream should be muted
- `video` - boolean if video stream should be muted
27. isStartAudioMuted() - check if audio is muted on join
27. `isStartAudioMuted()` - check if audio is muted on join
28. isStartVideoMuted() - check if video is muted on join
28. `isStartVideoMuted()` - check if video is muted on join
29. sendFeedback(overallFeedback, detailedFeedback) - Sends the given feedback through CallStats if enabled.
- overallFeedback an integer between 1 and 5 indicating the user feedback
- detailedFeedback detailed feedback from the user. Not yet used
29. `sendFeedback(overallFeedback, detailedFeedback)` - Sends the given feedback through CallStats if enabled.
- `overallFeedback` - an integer between 1 and 5 indicating the user feedback
- `detailedFeedback` - detailed feedback from the user. Not yet used
30. setSubject(subject) - change subject of the conference
- subject - string new subject
30. `setSubject(subject)` - changes the subject of the conference
- `subject` - string new subject
Note: available only for moderator
31. sendEndpointMessage(to, payload) - Sends message via the data channels.
- to - the id of the endpoint that should receive the message. If "" the message will be sent to all participants.
- payload - JSON object - the payload of the message.
31. `sendEndpointMessage(to, payload)` - Sends message via the data channels.
- `to` - the id of the endpoint that should receive the message. If "" the message will be sent to all participants.
- `payload` - JSON object - the payload of the message.
Throws NetworkError or InvalidStateError or Error if the operation fails.
32. broadcastEndpointMessage(payload) - Sends broadcast message via the datachannels.
- payload - JSON object - the payload of the message.
32. `broadcastEndpointMessage(payload)` - Sends broadcast message via the datachannels.
- `payload` - JSON object - the payload of the message.
Throws NetworkError or InvalidStateError or Error if the operation fails.
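For example, a broadcast over the data channels might look like this (the payload shape is application-defined; "reaction" is a made-up example):

```javascript
// Sketch: broadcast an application-defined message to all participants.
// Passing '' as the target id to sendEndpointMessage has the same effect.
function broadcastReaction(room, reaction) {
    room.broadcastEndpointMessage({ type: 'reaction', reaction });
}
```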
33. pinParticipant(participantId) - Elects the participant with the given id to be the pinned participant in order to always receive video for this participant (even when last n is enabled).
- participantId - the identifier of the participant
33. `pinParticipant(participantId)` - Elects the participant with the given id to be the pinned participant in order to always receive video for this participant (even when last n is enabled).
- `participantId` - the identifier of the participant
Throws NetworkError or InvalidStateError or Error if the operation fails.
34. setReceiverVideoConstraint(resolution) - set the desired resolution to get from JVB (180, 360, 720, 1080, etc).
34. `setReceiverVideoConstraint(resolution)` - set the desired resolution to get from JVB (180, 360, 720, 1080, etc).
You should use that method if you are using simulcast.
35. isHidden - checks if local user has joined as a "hidden" user. This is a specialized role used for integrations.
35. `setSenderVideoConstraint(resolution)` - set the desired resolution to send to JVB or the peer (180, 360, 720).
36. `isHidden` - checks if local user has joined as a "hidden" user. This is a specialized role used for integrations.
37. `setLocalParticipantProperty(propertyKey, propertyValue)` - used to set a custom property on the local participant (e.g. "fullName": "Full Name", "favoriteColor": "red", "userId": 234). This can also be used to modify an already set custom property.
- `propertyKey` - string - custom property name
- `propertyValue` - string - custom property value
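A short sketch combining the constraint and property setters above (the resolution values are examples only):

```javascript
// Sketch: request 720p from the bridge, send at most 360p,
// and publish a custom participant property.
function applyVideoPreferences(room) {
    room.setReceiverVideoConstraint(720);
    room.setSenderVideoConstraint(360);
    room.setLocalParticipantProperty('favoriteColor', 'red');
}
```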
JitsiTrack

@@ -423,48 +470,48 @@ ======

1. getType() - returns string with the type of the track( "video" for the video tracks and "audio" for the audio tracks)
1. `getType()` - returns a string with the type of the track ("video" for video tracks and "audio" for audio tracks)
2. mute() - mutes the track. Returns Promise.
2. `mute()` - mutes the track. Returns Promise.
Note: This method is implemented only for the local tracks.
3. unmute() - unmutes the track. Returns Promise.
3. `unmute()` - unmutes the track. Returns Promise.
Note: This method is implemented only for the local tracks.
4. isMuted() - check if track is muted
4. `isMuted()` - check if track is muted
5. attach(container) - attaches the track to the given container.
5. `attach(container)` - attaches the track to the given container.
6. detach(container) - removes the track from the container.
6. `detach(container)` - removes the track from the container.
7. dispose() - disposes the track. If the track is added to a conference the track will be removed. Returns Promise.
7. `dispose()` - disposes the track. If the track is added to a conference the track will be removed. Returns Promise.
Note: This method is implemented only for the local tracks.
8. getId() - returns unique string for the track.
8. `getId()` - returns unique string for the track.
9. getParticipantId() - returns id(string) of the track owner
9. `getParticipantId()` - returns the id (string) of the track owner
Note: This method is implemented only for the remote tracks.
10. setAudioOutput(audioOutputDeviceId) - sets new audio output device for track's DOM elements. Video tracks are ignored.
10. `setAudioOutput(audioOutputDeviceId)` - sets new audio output device for track's DOM elements. Video tracks are ignored.
11. getDeviceId() - returns device ID associated with track (for local tracks only)
11. `getDeviceId()` - returns device ID associated with track (for local tracks only)
12. isEnded() - returns true if track is ended
12. `isEnded()` - returns true if track is ended
13. setEffect(effect) - Applies the effect by swapping out the existing MediaStream on the JitsiTrack with the new
13. `setEffect(effect)` - Applies the effect by swapping out the existing MediaStream on the JitsiTrack with the new
MediaStream which has the desired effect. "undefined" is passed to this function for removing the effect and for
restoring the original MediaStream on the JitsiTrack.
restoring the original MediaStream on the `JitsiTrack`.
The following methods have to be defined for the effect instance.
startEffect() - Starts the effect and returns a new MediaStream that is to be swapped with the existing one.
`startEffect()` - Starts the effect and returns a new MediaStream that is to be swapped with the existing one.
stopEffect() - Stops the effect.
`stopEffect()` - Stops the effect.
isEnabled() - Checks if the local track supports the effect.
`isEnabled()` - Checks if the local track supports the effect.
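A minimal object satisfying the effect interface might look like this (the `mediaStream` argument to `startEffect` is an assumption about the call shape; a real effect would return a new, processed stream):

```javascript
// Sketch of the effect interface: a no-op "identity" effect that returns
// the stream unchanged, demonstrating the three required methods.
class IdentityEffect {
    startEffect(mediaStream) {
        // A real effect would build and return a new, processed MediaStream.
        return mediaStream;
    }
    stopEffect() {
        // Release any resources the effect holds.
    }
    isEnabled() {
        // Check browser/track support here.
        return true;
    }
}
```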

@@ -475,8 +522,8 @@ Note: This method is implemented only for the local tracks.

======
The object represents error that happened to a JitsiTrack. Is inherited from JavaScript base ```Error``` object,
so ```"name"```, ```"message"``` and ```"stack"``` properties are available. For GUM-related errors,
exposes additional ```"gum"``` property, which is an object with following properties:
- error - original GUM error
- constraints - GUM constraints object used for the call
- devices - array of devices requested in GUM call (possible values - "audio", "video", "screen", "desktop", "audiooutput")
The object represents an error that happened to a JitsiTrack. It inherits from the JavaScript base `Error` object,
so the `"name"`, `"message"` and `"stack"` properties are available. For GUM-related errors,
it exposes an additional `"gum"` property, which is an object with the following properties:
- `error` - original GUM error
- `constraints` - GUM constraints object used for the call
- `devices` - array of devices requested in GUM call (possible values - "audio", "video", "screen", "desktop", "audiooutput")
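For example, such an error can be summarized for logging using only the properties described above (a sketch):

```javascript
// Sketch: build a short log line from a JitsiTrackError-like object.
function describeTrackError(error) {
    let text = `${error.name}: ${error.message}`;

    if (error.gum && Array.isArray(error.gum.devices)) {
        text += ` (devices: ${error.gum.devices.join(', ')})`;
    }

    return text;
}
```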

@@ -486,3 +533,3 @@ Getting Started

1. The first thing you must do in order to use Jitsi Meet API is to initialize ```JitsiMeetJS``` object:
1. The first thing you must do in order to use the Jitsi Meet API is to initialize the `JitsiMeetJS` object:

@@ -511,3 +558,3 @@ ```javascript

4. After you receive the ```CONNECTION_ESTABLISHED``` event you are to create the ```JitsiConference``` object and
4. After you receive the `CONNECTION_ESTABLISHED` event you can create the `JitsiConference` object and
also attach listeners for conference events (we are going to add handlers for remote track, conference joined, etc.):

@@ -517,3 +564,2 @@

```javascript
room = connection.initJitsiConference("conference1", confOptions);

@@ -520,0 +566,0 @@ room.on(JitsiMeetJS.events.conference.TRACK_ADDED, onRemoteTrack);

@@ -241,19 +241,3 @@ /* global $, JitsiMeetJS */

const initOptions = {
disableAudioLevels: true,
// The ID of the jidesha extension for Chrome.
desktopSharingChromeExtId: 'mbocklcggfhnbahlnepmldehdhpjfcjp',
// Whether desktop sharing should be disabled on Chrome.
desktopSharingChromeDisabled: false,
// The media sources to use when using screen sharing with the Chrome
// extension.
desktopSharingChromeSources: [ 'screen', 'window' ],
// Required version of Chrome extension
desktopSharingChromeMinExtVersion: '0.1',
// Whether desktop sharing should be disabled on Firefox.
desktopSharingFirefoxDisabled: true
disableAudioLevels: true
};

@@ -260,0 +244,0 @@

@@ -21,3 +21,3 @@ JWT token authentication Prosody plugin

- 'exp' token expiration timestamp as defined in the RFC
- 'sub' contains the name of the domain used when authenticating with this token. By default assuming that we have full MUC 'conference1@muc.server.net' then 'server.net' should be used here.
- 'sub' contains EITHER the lowercase name of the tenant (for a conference like TENANT1/ROOM this would be 'tenant1') OR the lowercase name of the domain used when authenticating with this token (for a conference like /ROOM). By default, assuming that we have the full MUC 'conference1@muc.server.net', then 'server.net' should be used here. Alternatively, a '*' may be provided, allowing access to rooms in all tenants within the domain or all domains within the server.
- 'aud' application identifier. This value indicates what service is consuming the token. It should be negotiated with the service provider before generating the token.

@@ -31,7 +31,8 @@

In addition to the basic claims used in authentication, the token can also provide user display information in the 'context' field within the JWT payload:
- 'group' is a string which specifies the group the user belongs to. Intended for use in reporting/analytics
In addition to the basic claims used in authentication, the token can also provide user display information in the 'context' field within the JWT payload. None of the information in the context field is used for token validation:
- 'group' is a string which specifies the group the user belongs to. Intended for use in reporting/analytics, not used for token validation.
- 'user' is an object which contains display information for the current user
- 'id' is a user identifier string. Intended for use in reporting/analytics
- 'name' is the display name of the user
- 'email' is the email of the user
- 'avatar' is the URL of the avatar for the user
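Putting the claims together, a token payload might look like this (all values are illustrative placeholders and must match your deployment's configuration):

```javascript
// Illustrative JWT payload only -- values must match your deployment.
const jwtPayload = {
    aud: 'my-app',       // application identifier
    sub: 'tenant1',      // tenant, base domain, or '*'
    room: 'conference1', // compared against the MUC room name
    exp: 1700000000,     // expiration timestamp
    context: {           // display info, not used for token validation
        group: 'engineering',
        user: {
            id: 'user-42',
            name: 'Jane Doe',
            email: 'jane@example.com',
            avatar: 'https://example.com/avatar.png'
        }
    }
};
```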

@@ -87,3 +88,3 @@ - 'callee' is an optional object containing display information when launching a 1-1 video call with a single other participant. It used to display an overlay to the first user, before the second user joins.

- when the user connects to Prosody through BOSH. The token value is passed as the 'token' query parameter of the BOSH URL. The user uses the XMPP anonymous authentication method.
- when MUC room is being created/joined Prosody compares 'room' claim with the actual name of the room. This prevents from abusing stolen token by unathorized users to allocate new conference rooms in the system. Admin users are not required to provide valid token which is used by Jicofo for example.
- when a MUC room is being created/joined, Prosody compares the 'room' claim with the actual name of the room. In addition, the 'sub' claim is compared to either the tenant (for TENANT/ROOM URLs) or the base domain (for /ROOM URLs). This prevents a stolen token from being abused by unauthorized users to allocate new conference rooms in the system. Admin users are not required to provide a valid token; this is used by Jicofo, for example.

@@ -131,3 +132,3 @@ ### Lib-jitsi-meet options

JWT token authentication requires prosody-trunk version at least 747.
JWT token authentication requires prosody-trunk version at least 747. JWT tokens with websockets require prosody 0.11.6 or higher.

@@ -134,0 +135,0 @@ You can download latest prosody-trunk packages from [here]. Then install it with the following command:

@@ -37,2 +37,14 @@ /**

/**
* Indicates that a connection error is due to not allowed,
* occurred when trying to join a conference, only approved members are allowed to join.
*/
export const MEMBERS_ONLY_ERROR = 'conference.connectionError.membersOnly';
/**
* Indicates that a connection error is due to denied access to the room,
* occurred after joining a lobby room and access is denied by the room moderators.
*/
export const CONFERENCE_ACCESS_DENIED = 'conference.connectionError.accessDenied';
/**
* Indicates that focus error happened.

@@ -53,2 +65,7 @@ */

/**
* Indicates that the media connection has failed.
*/
export const ICE_FAILED = 'conference.iceFailed';
/**
* Indicates that the versions of the server side components are incompatible

@@ -81,9 +98,4 @@ * with the client side.

/**
* Indicates that the conference setup failed.
*/
export const SETUP_FAILED = 'conference.setup_failed';
/**
* Indicates that there is no available videobridge.
*/
export const VIDEOBRIDGE_NOT_AVAILABLE = 'conference.videobridgeNotAvailable';
/* global __filename */
import { getLogger } from 'jitsi-meet-logger';
import { Strophe } from 'strophe.js';
import * as JitsiConferenceErrors from './JitsiConferenceErrors';
import * as JitsiConferenceEvents from './JitsiConferenceEvents';
import Statistics from './modules/statistics/statistics';
import EventEmitterForwarder from './modules/util/EventEmitterForwarder';
import * as MediaType from './service/RTC/MediaType';
import RTCEvents from './service/RTC/RTCEvents';
import VideoType from './service/RTC/VideoType';
import AuthenticationEvents
from './service/authentication/AuthenticationEvents';
import {

@@ -12,12 +23,2 @@ ACTION_JINGLE_SA_TIMEOUT,

} from './service/statistics/AnalyticsEvents';
import AuthenticationEvents
from './service/authentication/AuthenticationEvents';
import EventEmitterForwarder from './modules/util/EventEmitterForwarder';
import { getLogger } from 'jitsi-meet-logger';
import * as JitsiConferenceErrors from './JitsiConferenceErrors';
import * as JitsiConferenceEvents from './JitsiConferenceEvents';
import * as MediaType from './service/RTC/MediaType';
import RTCEvents from './service/RTC/RTCEvents';
import VideoType from './service/RTC/VideoType';
import Statistics from './modules/statistics/statistics';
import XMPPEvents from './service/xmpp/XMPPEvents';

@@ -163,2 +164,5 @@

JitsiConferenceErrors.NOT_ALLOWED_ERROR);
this.chatRoomForwarder.forward(XMPPEvents.ROOM_CONNECT_MEMBERS_ONLY_ERROR,
JitsiConferenceEvents.CONFERENCE_FAILED,
JitsiConferenceErrors.MEMBERS_ONLY_ERROR);

@@ -277,4 +281,13 @@ this.chatRoomForwarder.forward(XMPPEvents.ROOM_MAX_USERS_ERROR,

this.chatRoomForwarder.forward(XMPPEvents.MUC_MEMBERS_ONLY_CHANGED,
JitsiConferenceEvents.MEMBERS_ONLY_CHANGED);
chatRoom.addListener(XMPPEvents.MUC_MEMBER_JOINED,
conference.onMemberJoined.bind(conference));
this.chatRoomForwarder.forward(XMPPEvents.MUC_LOBBY_MEMBER_JOINED,
JitsiConferenceEvents.LOBBY_USER_JOINED);
this.chatRoomForwarder.forward(XMPPEvents.MUC_LOBBY_MEMBER_UPDATED,
JitsiConferenceEvents.LOBBY_USER_UPDATED);
this.chatRoomForwarder.forward(XMPPEvents.MUC_LOBBY_MEMBER_LEFT,
JitsiConferenceEvents.LOBBY_USER_LEFT);
chatRoom.addListener(XMPPEvents.MUC_MEMBER_BOT_TYPE_CHANGED,

@@ -286,2 +299,5 @@ conference._onMemberBotTypeChanged.bind(conference));

JitsiConferenceEvents.CONFERENCE_LEFT);
this.chatRoomForwarder.forward(XMPPEvents.MUC_DENIED_ACCESS,
JitsiConferenceEvents.CONFERENCE_FAILED,
JitsiConferenceErrors.CONFERENCE_ACCESS_DENIED);

@@ -288,0 +304,0 @@ chatRoom.addListener(XMPPEvents.DISPLAY_NAME_CHANGED,

@@ -17,7 +17,2 @@ /**

/**
* A participant avatar has changed.
*/
export const AVATAR_CHANGED = 'conference.avatarChanged';
/**
* Fired just before the statistics module is disposed and it's the last chance

@@ -153,2 +148,24 @@ * to submit some logs to the statistics service (ex. CallStats if enabled),

/**
* An event(library-private) fired when a new media session is added to the conference.
* @type {string}
* @private
*/
export const _MEDIA_SESSION_STARTED = 'conference.media_session.started';
/**
* An event(library-private) fired when the conference switches the currently active media session.
* @type {string}
* @private
*/
export const _MEDIA_SESSION_ACTIVE_CHANGED = 'conference.media_session.active_changed';
/**
* Indicates that the conference's members-only mode has been enabled or disabled.
* The first argument of this event is a <tt>boolean</tt> which, when set to
* <tt>true</tt>, means that the conference is running in members-only mode.
* You may need to use Lobby, if supported, to ask for permission to enter the conference.
*/
export const MEMBERS_ONLY_CHANGED = 'conference.membersOnlyChanged';
/**
* New text message was received.

@@ -333,1 +350,16 @@ */

export const BOT_TYPE_CHANGED = 'conference.bot_type_changed';
/**
* A new user joined the lobby room.
*/
export const LOBBY_USER_JOINED = 'conference.lobby.userJoined';
/**
* A user in the lobby room has been updated.
*/
export const LOBBY_USER_UPDATED = 'conference.lobby.userUpdated';
/**
* A user left the lobby room.
*/
export const LOBBY_USER_LEFT = 'conference.lobby.userLeft';

@@ -0,1 +1,5 @@

import JitsiConference from './JitsiConference';
import * as JitsiConnectionEvents from './JitsiConnectionEvents';
import Statistics from './modules/statistics/statistics';
import XMPP from './modules/xmpp/xmpp';
import {

@@ -5,6 +9,2 @@ CONNECTION_DISCONNECTED as ANALYTICS_CONNECTION_DISCONNECTED,

} from './service/statistics/AnalyticsEvents';
import JitsiConference from './JitsiConference';
import * as JitsiConnectionEvents from './JitsiConnectionEvents';
import Statistics from './modules/statistics/statistics';
import XMPP from './modules/xmpp/xmpp';

@@ -167,1 +167,24 @@ /**

};
/**
* Get object with internal logs.
*/
JitsiConnection.prototype.getLogs = function() {
const data = this.xmpp.getJingleLog();
const metadata = {};
metadata.time = new Date();
metadata.url = window.location.href;
metadata.ua = navigator.userAgent;
const log = this.xmpp.getXmppLog();
if (log) {
metadata.xmpp = log;
}
data.metadata = metadata;
return data;
};

@@ -41,1 +41,8 @@ /**

export const WRONG_STATE = 'connection.wrongState';
/**
* Indicates that a display name is required over this connection and needs to be supplied when
* joining the room.
* There are cases, like the lobby room, where a display name is required.
*/
export const DISPLAY_NAME_REQUIRED = 'connection.display_name_required';
import EventEmitter from 'events';
import * as JitsiMediaDevicesEvents from './JitsiMediaDevicesEvents';
import RTC from './modules/RTC/RTC';
import browser from './modules/browser';
import Statistics from './modules/statistics/statistics';
import * as MediaType from './service/RTC/MediaType';
import browser from './modules/browser';
import RTC from './modules/RTC/RTC';
import RTCEvents from './service/RTC/RTCEvents';
import Statistics from './modules/statistics/statistics';
import * as JitsiMediaDevicesEvents from './JitsiMediaDevicesEvents';
const AUDIO_PERMISSION_NAME = 'microphone';

@@ -139,2 +138,10 @@ const PERMISSION_GRANTED_STATUS = 'granted';

if (!supported) {
// Workaround on Safari for audio input device
// selection to work. Safari doesn't support the
// permissions query.
if (browser.isSafari()) {
resolve(true);
return;
}
resolve(false);

@@ -217,3 +224,3 @@

if (availableDevices && availableDevices.length > 0) {
if (availableDevices.length > 0) {
// if we have devices info report device to stats

@@ -220,0 +227,0 @@ // normally this will not happen on startup as this method is called

/* global __filename */
import getActiveAudioDevice from './modules/detection/ActiveDeviceDetector';
import * as DetectionEvents from './modules/detection/DetectionEvents';
import TrackVADEmitter from './modules/detection/TrackVADEmitter';
import { createGetUserMediaEvent } from './service/statistics/AnalyticsEvents';
import AuthUtil from './modules/util/AuthUtil';
import * as ConnectionQualityEvents
from './service/connectivity/ConnectionQualityEvents';
import * as E2ePingEvents from './service/e2eping/E2ePingEvents';
import GlobalOnErrorHandler from './modules/util/GlobalOnErrorHandler';
import Logger from 'jitsi-meet-logger';
import * as JitsiConferenceErrors from './JitsiConferenceErrors';

@@ -23,16 +16,28 @@ import * as JitsiConferenceEvents from './JitsiConferenceEvents';

import * as JitsiTranscriptionStatus from './JitsiTranscriptionStatus';
import LocalStatsCollector from './modules/statistics/LocalStatsCollector';
import Logger from 'jitsi-meet-logger';
import * as MediaType from './service/RTC/MediaType';
import Resolutions from './service/RTC/Resolutions';
import RTC from './modules/RTC/RTC';
import browser from './modules/browser';
import NetworkInfo from './modules/connectivity/NetworkInfo';
import { ParticipantConnectionStatus }
from './modules/connectivity/ParticipantConnectionStatus';
import RTC from './modules/RTC/RTC';
import browser from './modules/browser';
import ScriptUtil from './modules/util/ScriptUtil';
import recordingConstants from './modules/recording/recordingConstants';
import getActiveAudioDevice from './modules/detection/ActiveDeviceDetector';
import * as DetectionEvents from './modules/detection/DetectionEvents';
import TrackVADEmitter from './modules/detection/TrackVADEmitter';
import ProxyConnectionService
from './modules/proxyconnection/ProxyConnectionService';
import recordingConstants from './modules/recording/recordingConstants';
import Settings from './modules/settings/Settings';
import LocalStatsCollector from './modules/statistics/LocalStatsCollector';
import precallTest from './modules/statistics/PrecallTest';
import Statistics from './modules/statistics/statistics';
import AuthUtil from './modules/util/AuthUtil';
import GlobalOnErrorHandler from './modules/util/GlobalOnErrorHandler';
import ScriptUtil from './modules/util/ScriptUtil';
import * as VideoSIPGWConstants from './modules/videosipgw/VideoSIPGWConstants';
import AudioMixer from './modules/webaudio/AudioMixer';
import * as MediaType from './service/RTC/MediaType';
import Resolutions from './service/RTC/Resolutions';
import * as ConnectionQualityEvents
from './service/connectivity/ConnectionQualityEvents';
import * as E2ePingEvents from './service/e2eping/E2ePingEvents';
import { createGetUserMediaEvent } from './service/statistics/AnalyticsEvents';

@@ -175,2 +180,3 @@ const logger = Logger.getLogger(__filename);

init(options = {}) {
Settings.init(options.externalStorage);
Statistics.init(options);

@@ -303,6 +309,2 @@

* @param {string} options.micDeviceId
* @param {object} options.desktopSharingExtensionExternalInstallation -
* enables external installation process for desktop sharing extension if
* the inline installation is not possible. The following properties should
* be provided:
* @param {integer} interval - the interval (in ms) for

@@ -450,3 +452,3 @@ * checking whether the desktop sharing extension is installed or not

if (error.name
=== JitsiTrackErrors.CHROME_EXTENSION_USER_CANCELED) {
=== JitsiTrackErrors.SCREENSHARING_USER_CANCELED) {
// User cancelled action is not really an error, so only

@@ -456,3 +458,3 @@ // log it as an event to avoid having conference classified

const logObject = {
id: 'chrome_extension_user_canceled',
id: 'screensharing_user_canceled',
message: error.message

@@ -525,2 +527,12 @@ };

/**
* Create AudioMixer, which is essentially a wrapper over web audio ChannelMergerNode. It essentially allows the
* user to mix multiple MediaStreams into a single one.
*
* @returns {AudioMixer}
*/
createAudioMixer() {
return new AudioMixer();
},
/**
* Go through all audio devices on the system and return one that is active, i.e. has audio signal.

@@ -617,2 +629,12 @@ *

/**
* Informs lib-jitsi-meet about the current network status.
*
* @param {boolean} isOnline - {@code true} if the internet connectivity is online or {@code false}
* otherwise.
*/
setNetworkInfo({ isOnline }) {
NetworkInfo.updateNetworkInfo({ isOnline });
},
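A plausible way to feed `setNetworkInfo` is from the browser's connectivity events. The wiring below is a hedged sketch (the `wireNetworkInfo` helper and its parameters are illustrative, not part of the library); it takes the API object and window as arguments so it can run against plain stand-ins:

```javascript
// Hypothetical helper: forwards online/offline state to lib-jitsi-meet.
function wireNetworkInfo(jitsiMeetJS, win) {
    const update = () => jitsiMeetJS.setNetworkInfo({ isOnline: win.navigator.onLine });

    win.addEventListener('online', update);
    win.addEventListener('offline', update);
    update(); // report the initial state immediately
}
```

In a browser this would be invoked as `wireNetworkInfo(JitsiMeetJS, window)`.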
/**
* Set the contentHint on the transmitted stream track to indicate

@@ -635,2 +657,4 @@ * charaterstics in the video stream, which informs PeerConnection

precallTest,
/* eslint-enable max-params */

@@ -637,0 +661,0 @@

import { getLogger } from 'jitsi-meet-logger';
import { Strophe } from 'strophe.js';
import { getLogger } from 'jitsi-meet-logger';

@@ -6,0 +6,0 @@ import * as JitsiConferenceEvents from './JitsiConferenceEvents';

@@ -7,12 +7,6 @@ import * as JitsiTrackErrors from './JitsiTrackErrors';

= 'Video resolution is not supported: ';
TRACK_ERROR_TO_MESSAGE_MAP[JitsiTrackErrors.CHROME_EXTENSION_INSTALLATION_ERROR]
= 'Failed to install Chrome extension';
TRACK_ERROR_TO_MESSAGE_MAP[
JitsiTrackErrors.CHROME_EXTENSION_USER_GESTURE_REQUIRED]
= 'Failed to install Chrome extension - installations can only be initiated'
+ ' by a user gesture.';
TRACK_ERROR_TO_MESSAGE_MAP[JitsiTrackErrors.CHROME_EXTENSION_USER_CANCELED]
= 'User canceled Chrome\'s screen sharing prompt';
TRACK_ERROR_TO_MESSAGE_MAP[JitsiTrackErrors.CHROME_EXTENSION_GENERIC_ERROR]
= 'Unknown error from Chrome extension';
TRACK_ERROR_TO_MESSAGE_MAP[JitsiTrackErrors.SCREENSHARING_USER_CANCELED]
= 'User canceled screen sharing prompt';
TRACK_ERROR_TO_MESSAGE_MAP[JitsiTrackErrors.SCREENSHARING_GENERIC_ERROR]
= 'Unknown error from screensharing';
TRACK_ERROR_TO_MESSAGE_MAP[JitsiTrackErrors.ELECTRON_DESKTOP_PICKER_ERROR]

@@ -19,0 +13,0 @@ = 'Unkown error from desktop picker';

@@ -6,32 +6,2 @@ /**

/**
* Generic error for jidesha extension for Chrome.
*/
export const CHROME_EXTENSION_GENERIC_ERROR
= 'gum.chrome_extension_generic_error';
/**
* An error which indicates that the jidesha extension for Chrome is
* failed to install.
*/
export const CHROME_EXTENSION_INSTALLATION_ERROR
= 'gum.chrome_extension_installation_error';
/**
* This error indicates that the attempt to start screensharing was initiated by
* a script which did not originate in a user gesture handler. It means that
* you should trigger the action again in response to a button click, for
* example.
* @type {string}
*/
export const CHROME_EXTENSION_USER_GESTURE_REQUIRED
= 'gum.chrome_extension_user_gesture_required';
/**
* An error which indicates that user canceled screen sharing window
* selection dialog in jidesha extension for Chrome.
*/
export const CHROME_EXTENSION_USER_CANCELED
= 'gum.chrome_extension_user_canceled';
/**
* An error which indicates that some of requested constraints in

@@ -57,8 +27,2 @@ * getUserMedia call were not satisfied.

/**
* An error which indicates that the jidesha extension for Firefox is
* needed to proceed with screen sharing, and that it is not installed.
*/
export const FIREFOX_EXTENSION_NEEDED = 'gum.firefox_extension_needed';
/**
* Generic getUserMedia error.

@@ -80,2 +44,15 @@ */

/**
* Generic error for screensharing failure.
*/
export const SCREENSHARING_GENERIC_ERROR
= 'gum.screensharing_generic_error';
/**
* An error which indicates that user canceled screen sharing window
* selection dialog.
*/
export const SCREENSHARING_USER_CANCELED
= 'gum.screensharing_user_canceled';
/**
* An error which indicates that track has been already disposed and cannot

@@ -82,0 +59,0 @@ * be longer used.

@@ -40,1 +40,7 @@ /**

export const NO_DATA_FROM_SOURCE = 'track.no_data_from_source';
/**
* Indicates that the local audio track is not receiving any audio input from
* the microphone that is currently selected.
*/
export const NO_AUDIO_INPUT = 'track.no_audio_input';

@@ -17,4 +17,4 @@ // Karma configuration

files: [
'./doc/example/libs/jquery-2.1.1.js',
'node_modules/core-js/index.js',
'./index.js',
'./modules/**/*.spec.js'

@@ -32,4 +32,3 @@ ],

'node_modules/core-js/**': [ 'webpack' ],
'./index.js': [ 'webpack' ],
'./**/*.spec.js': [ 'webpack' ]
'./**/*.spec.js': [ 'webpack', 'sourcemap' ]
},

@@ -66,4 +65,4 @@

webpack: require('./webpack.config.js')
webpack: require('./webpack-shared-config')
});
};

@@ -0,3 +1,3 @@

import { BrowserDetection } from '@jitsi/js-utils';
import { getLogger } from 'jitsi-meet-logger';
import { BrowserDetection } from '@jitsi/js-utils';

@@ -33,3 +33,3 @@ const logger = getLogger(__filename);

doesVideoMuteByStreamRemove() {
return this.isChromiumBased();
return this.isChromiumBased() || this.isSafari();
}

@@ -43,3 +43,3 @@

supportsP2P() {
return !this.isFirefox();
return !this.usesUnifiedPlan();
}

@@ -67,24 +67,11 @@

/**
* Checks if current browser is a Safari and a version of Safari that
* supports native webrtc.
* Checks whether the current running context is a Trusted Web Application.
*
* @returns {boolean}
* @returns {boolean} Whether the current context is a TWA.
*/
isSafariWithWebrtc() {
return this.isSafari()
&& !this.isVersionLessThan('11');
isTwa() {
return 'matchMedia' in window && window.matchMedia('(display-mode:standalone)').matches;
}
/**
* Checks if current browser is a Safari and a version of Safari that
* supports VP8.
*
* @returns {boolean}
*/
isSafariWithVP8() {
return this.isSafari()
&& !this.isVersionLessThan('12.1');
}
/**
* Checks if the current browser is supported.

@@ -98,3 +85,3 @@ *

|| this.isReactNative()
|| this.isSafariWithWebrtc();
|| (this.isSafari() && !this.isVersionLessThan('12.1'));
}

@@ -109,3 +96,3 @@

isUserInteractionRequiredForUnmute() {
return (this.isFirefox() && this.isVersionLessThan('68')) || this.isSafari();
return this.isFirefox() && this.isVersionLessThan('68');
}

@@ -120,4 +107,3 @@

supportsVideoMuteOnConnInterrupted() {
return this.isChromiumBased() || this.isReactNative()
|| this.isSafariWithVP8();
return this.isChromiumBased() || this.isReactNative() || this.isSafari();
}

@@ -133,6 +119,21 @@

// side, but not sure if not possible ?
return !this.isFirefox() && !this.isSafariWithWebrtc();
return !this.isFirefox() && !this.isSafari();
}
/**
* Checks if the current browser supports setting codec preferences on the transceiver.
* @returns {boolean}
*/
supportsCodecPreferences() {
return this.usesUnifiedPlan()
&& typeof window.RTCRtpTransceiver !== 'undefined'
&& Object.keys(window.RTCRtpTransceiver.prototype).indexOf('setCodecPreferences') > -1
&& Object.keys(RTCRtpSender.prototype).indexOf('getCapabilities') > -1
// this is not working on Safari because of the following bug
// https://bugs.webkit.org/show_bug.cgi?id=215567
&& !this.isSafari();
}
/**
* Checks if the current browser support the device change event.

@@ -152,7 +153,24 @@ * @return {boolean}

supportsLocalCandidateRttStatistics() {
return this.isChromiumBased() || this.isReactNative()
|| this.isSafariWithVP8();
return this.isChromiumBased() || this.isReactNative() || this.isSafari();
}
/**
* Checks if the current browser supports the Long Tasks API that lets us observe
* performance measurement events and be notified of tasks that take longer than
* 50ms to execute on the main thread.
*/
supportsPerformanceObserver() {
return typeof window.PerformanceObserver !== 'undefined'
&& PerformanceObserver.supportedEntryTypes.indexOf('longtask') > -1;
}
/**
* Checks if the current browser supports audio level stats on the receivers.
*/
supportsReceiverStats() {
return typeof window.RTCRtpReceiver !== 'undefined'
&& Object.keys(RTCRtpReceiver.prototype).indexOf('getSynchronizationSources') > -1;
}
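`supportsReceiverStats` relies on the fact that WebIDL-defined prototype members are enumerable in browsers, so `Object.keys` on the prototype reveals them. The generic pattern can be factored out as below; this is a sketch, and `hasEnumerableProtoMember` is an illustrative name, not a library API:

```javascript
// Returns true when ctor exists and exposes `name` as an enumerable
// member of its prototype, mirroring the checks in browser.js.
function hasEnumerableProtoMember(ctor, name) {
    return typeof ctor !== 'undefined'
        && Object.keys(ctor.prototype).indexOf(name) > -1;
}
```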
/**
* Checks if the current browser reports round trip time statistics for

@@ -175,55 +193,30 @@ * the ICE candidate pair.

/**
* Checks whether the browser supports RTPSender.
* Returns whether or not the current browser can support capturing video,
* be it camera or desktop, and displaying received video.
*
* @returns {boolean}
*/
supportsRtpSender() {
return this.isFirefox() || this.isSafariWithVP8();
supportsVideo() {
return true;
}
/**
* Checks whether the browser supports RTX.
* Checks if the browser uses plan B.
*
* @returns {boolean}
*/
supportsRtx() {
return !this.isFirefox() && !this.usesUnifiedPlan();
usesPlanB() {
return !this.usesUnifiedPlan();
}
/**
* Whether jitsi-meet supports simulcast on the current browser.
* @returns {boolean}
*/
supportsSimulcast() {
return this.isChromiumBased() || this.isFirefox()
|| this.isSafariWithVP8() || this.isReactNative();
}
/**
* Returns whether or not the current browser can support capturing video,
* be it camera or desktop, and displaying received video.
* Checks if the browser uses SDP munging for turning on simulcast.
*
* @returns {boolean}
*/
supportsVideo() {
// FIXME: Check if we can use supportsVideoOut and supportsVideoIn. I
// leave the old implementation here in order not to brake something.
// Older versions of Safari using webrtc/adapter do not support video
// due in part to Safari only supporting H264 and the bridge sending VP8
// Newer Safari support VP8 and other WebRTC features.
return !this.isSafariWithWebrtc()
|| (this.isSafariWithVP8() && this.usesPlanB());
usesSdpMungingForSimulcast() {
return this.isChromiumBased() || this.isReactNative() || this.isSafari();
}
/**
* Checks if the browser uses plan B.
*
* @returns {boolean}
*/
usesPlanB() {
return !this.usesUnifiedPlan();
}
/**
* Checks if the browser uses unified plan.

@@ -238,3 +231,3 @@ *

if (this.isSafariWithVP8() && typeof window.RTCRtpTransceiver !== 'undefined') {
if (this.isSafari() && typeof window.RTCRtpTransceiver !== 'undefined') {
// eslint-disable-next-line max-len

@@ -265,3 +258,3 @@ // https://trac.webkit.org/changeset/236144/webkit/trunk/LayoutTests/webrtc/video-addLegacyTransceiver.html

if (this.isFirefox() || this.isSafariWithWebrtc()) {
if (this.isFirefox() || this.isSafari()) {
return true;

@@ -288,2 +281,10 @@ }

/**
* Checks if the browser uses RIDs/MIDs for signaling the simulcast streams
* to the bridge instead of the ssrcs.
*/
usesRidsForSimulcast() {
return false;
}
/**
* Checks if the browser supports getDisplayMedia.

@@ -300,2 +301,38 @@ * @returns {boolean} {@code true} if the browser supports getDisplayMedia.

/**
* Checks if the browser supports insertable streams, needed for E2EE.
* @returns {boolean} {@code true} if the browser supports insertable streams.
*/
supportsInsertableStreams() {
if (!(typeof window.RTCRtpSender !== 'undefined'
&& (window.RTCRtpSender.prototype.createEncodedStreams
|| window.RTCRtpSender.prototype.createEncodedVideoStreams))) {
return false;
}
// Feature-detect transferable streams which we need to operate in a worker.
// See https://groups.google.com/a/chromium.org/g/blink-dev/c/1LStSgBt6AM/m/hj0odB8pCAAJ
const stream = new ReadableStream();
try {
window.postMessage(stream, '*', [ stream ]);
return true;
} catch {
return false;
}
}
/**
* Whether the browser supports the RED format for audio.
*/
supportsAudioRed() {
return Boolean(window.RTCRtpSender
&& window.RTCRtpSender.getCapabilities
&& window.RTCRtpSender.getCapabilities('audio').codecs.some(codec => codec.mimeType === 'audio/red')
&& window.RTCRtpReceiver
&& window.RTCRtpReceiver.getCapabilities
&& window.RTCRtpReceiver.getCapabilities('audio').codecs.some(codec => codec.mimeType === 'audio/red'));
}
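`supportsAudioRed` boils down to scanning the codec lists returned by `getCapabilities('audio')` on both the sender and the receiver for an `audio/red` entry. The scan itself can be isolated as below (an illustrative helper, shown here against a mock capabilities object rather than a real WebRTC one):

```javascript
// True when the capabilities object lists the given codec MIME type.
function capabilitiesHaveCodec(capabilities, mimeType) {
    return Boolean(capabilities
        && Array.isArray(capabilities.codecs)
        && capabilities.codecs.some(codec => codec.mimeType === mimeType));
}
```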
/**
* Checks if the browser supports the "sdpSemantics" configuration option.

@@ -302,0 +339,0 @@ * https://webrtc.org/web-apis/chrome/unified-plan/

@@ -1,9 +0,10 @@

import * as ConnectionQualityEvents
from '../../service/connectivity/ConnectionQualityEvents';
import * as ConferenceEvents from '../../JitsiConferenceEvents';
import { getLogger } from 'jitsi-meet-logger';
import * as ConferenceEvents from '../../JitsiConferenceEvents';
import * as RTCEvents from '../../service/RTC/RTCEvents';
import * as ConnectionQualityEvents from '../../service/connectivity/ConnectionQualityEvents';
const Resolutions = require('../../service/RTC/Resolutions');
const VideoType = require('../../service/RTC/VideoType');
const XMPPEvents = require('../../service/xmpp/XMPPEvents');
const VideoType = require('../../service/RTC/VideoType');
const Resolutions = require('../../service/RTC/Resolutions');

@@ -281,2 +282,7 @@ const logger = getLogger(__filename);

});
conference.rtc.on(
RTCEvents.LOCAL_TRACK_MAX_ENABLED_RESOLUTION_CHANGED,
track => {
this._localStats.maxEnabledResolution = track.maxEnabledResolution;
});

@@ -458,2 +464,3 @@ conference.on(

serverRegion: this._localStats.serverRegion,
maxEnabledResolution: this._localStats.maxEnabledResolution,
avgAudioLevels: this._localStats.localAvgAudioLevels

@@ -509,3 +516,4 @@ };

const isMuted = localVideoTrack ? localVideoTrack.isMuted() : true;
const resolution = localVideoTrack ? localVideoTrack.resolution : null;
const resolution = localVideoTrack
? Math.min(localVideoTrack.resolution, localVideoTrack.maxEnabledResolution) : null;

@@ -551,2 +559,3 @@ if (!isMuted) {

serverRegion: data.serverRegion,
maxEnabledResolution: data.maxEnabledResolution,
avgAudioLevels: data.avgAudioLevels

@@ -553,0 +562,0 @@ };

/* global __filename */
import { getLogger } from 'jitsi-meet-logger';
import * as JitsiConferenceEvents from '../../JitsiConferenceEvents';
import * as JitsiTrackEvents from '../../JitsiTrackEvents';
import * as MediaType from '../../service/RTC/MediaType';
import RTCEvents from '../../service/RTC/RTCEvents';
import { createParticipantConnectionStatusEvent } from '../../service/statistics/AnalyticsEvents';
import browser from '../browser';
import RTCEvents from '../../service/RTC/RTCEvents';
import Statistics from '../statistics/statistics';
import { createParticipantConnectionStatusEvent } from '../../service/statistics/AnalyticsEvents';

@@ -11,0 +12,0 @@ const logger = getLogger(__filename);

import EventEmitter from 'events';
import * as JitsiConferenceEvents from '../../JitsiConferenceEvents';
import * as JitsiTrackEvents from '../../JitsiTrackEvents';
import browser from '../browser';

@@ -33,3 +35,5 @@ import * as DetectionEvents from './DetectionEvents';

conference.statistics.addAudioLevelListener(this._audioLevel.bind(this));
if (!browser.supportsReceiverStats()) {
conference.statistics.addAudioLevelListener(this._audioLevel.bind(this));
}
conference.on(JitsiConferenceEvents.TRACK_ADDED, this._trackAdded.bind(this));

@@ -62,3 +66,2 @@ }

this._hasAudioInput = status;
this.emit(DetectionEvents.AUDIO_INPUT_STATE_CHANGE, this._hasAudioInput);

@@ -108,3 +111,2 @@ }

// Only target the current active track in the tpc. For some reason audio levels for previous

@@ -120,3 +122,2 @@ // devices are also picked up from the PeerConnection so we filter them out.

this._handleNoAudioInputDetection(audioLevel);
}

@@ -136,4 +137,21 @@

this._clearTriggerTimeout();
// Listen for the audio levels on the newly added audio track
if (browser.supportsReceiverStats()) {
track.on(
JitsiTrackEvents.NO_AUDIO_INPUT,
audioLevel => {
this._handleNoAudioInputDetection(audioLevel);
}
);
track.on(
JitsiTrackEvents.TRACK_AUDIO_LEVEL_CHANGED,
audioLevel => {
this._handleNoAudioInputDetection(audioLevel);
this._handleAudioInputStateChange(audioLevel);
}
);
}
}
}
}
import EventEmitter from 'events';
import RTC from '../RTC/RTC';
import { createAudioContext } from '../webaudio/WebAudioUtils';
import { createAudioContext } from './webaudio/WebAudioUtils';
import { VAD_SCORE_PUBLISHED } from './DetectionEvents';

@@ -7,0 +7,0 @@

/* global __filename */
import { getLogger } from 'jitsi-meet-logger';
import { createE2eRttEvent } from '../../service/statistics/AnalyticsEvents';
import * as JitsiConferenceEvents from '../../JitsiConferenceEvents';
import * as E2ePingEvents
from '../../service/e2eping/E2ePingEvents';
import * as JitsiConferenceEvents from '../../JitsiConferenceEvents';
import { createE2eRttEvent } from '../../service/statistics/AnalyticsEvents';
import Statistics from '../statistics/statistics';

@@ -8,0 +9,0 @@

/* global __filename */
import { getLogger } from 'jitsi-meet-logger';
import * as JitsiConferenceEvents from '../../JitsiConferenceEvents';

@@ -5,0 +6,0 @@

import { getLogger } from 'jitsi-meet-logger';
import RTC from '../RTC/RTC';
import RTCEvents from '../../service/RTC/RTCEvents';
import XMPPEvents from '../../service/xmpp/XMPPEvents';
import RTC from '../RTC/RTC';
import JingleSessionPC from '../xmpp/JingleSessionPC';

@@ -216,4 +215,4 @@ import { DEFAULT_STUN_SERVERS } from '../xmpp/xmpp';

/**
* {@code JingleSessionPC} expects an instance of
* {@code JitsiConference} to be passed in. {@code ProxyConnectionPC}
* {@link JingleSessionPC} expects an instance of
* {@link ChatRoom} to be passed in. {@link ProxyConnectionPC}
* is instantiated outside of the {@code JitsiConference}, so it must be

@@ -237,2 +236,8 @@ * stubbed to prevent errors.

/**
* A {@code JitsiConference} stub passed to the {@link RTC} module.
* @type {Object}
*/
const conferenceStub = {};
/**
* Create an instance of {@code RTC} as it is required for peer

@@ -243,3 +248,3 @@ * connection creation by {@code JingleSessionPC}. An existing instance

*/
this._rtc = new RTC(this, {});
this._rtc = new RTC(conferenceStub, {});

@@ -246,0 +251,0 @@ /**

@@ -23,11 +23,10 @@ import { getLogger } from 'jitsi-meet-logger';

* @param {string} [wsUrl] WebSocket URL.
* @param {EventEmitter} eventEmitter EventEmitter instance.
* @param {EventEmitter} emitter the EventEmitter instance to use for event emission.
* @param {function} senderVideoConstraintsChanged callback to call when the sender video constraints change.
*/
constructor(peerconnection, wsUrl, emitter) {
constructor(peerconnection, wsUrl, emitter, senderVideoConstraintsChanged) {
if (!peerconnection && !wsUrl) {
throw new TypeError(
'At least peerconnection or wsUrl must be given');
throw new TypeError('At least peerconnection or wsUrl must be given');
} else if (peerconnection && wsUrl) {
throw new TypeError(
'Just one of peerconnection or wsUrl must be given');
throw new TypeError('Just one of peerconnection or wsUrl must be given');
}

@@ -58,2 +57,4 @@

this._senderVideoConstraintsChanged = senderVideoConstraintsChanged;
// If a RTCPeerConnection is given, listen for new RTCDataChannel

@@ -203,9 +204,8 @@ // event.

sendSetLastNMessage(value) {
const jsonObject = {
logger.log(`Sending lastN=${value}.`);
this._send({
colibriClass: 'LastNChangedEvent',
lastN: value
};
this._send(jsonObject);
logger.log(`Channel lastN set to: ${value}`);
});
}
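The bridge channel messages are plain JSON objects keyed by `colibriClass`. The lastN message assembled above has the following shape; this sketch covers only the payload, not the WebSocket/data-channel transport:

```javascript
// Builds the payload sent by sendSetLastNMessage.
function buildLastNMessage(value) {
    return {
        colibriClass: 'LastNChangedEvent',
        lastN: value
    };
}
```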

@@ -221,5 +221,3 @@

sendPinnedEndpointMessage(endpointId) {
logger.log(
'sending pinned changed notification to the bridge for endpoint ',
endpointId);
logger.log(`Sending pinned endpoint: ${endpointId}.`);

@@ -241,5 +239,3 @@ this._send({

sendSelectedEndpointsMessage(endpointIds) {
logger.log(
'sending selected changed notification to the bridge for endpoints',
endpointIds);
logger.log(`Sending selected endpoints: ${endpointIds}.`);

@@ -258,4 +254,3 @@ this._send({

sendReceiverVideoConstraintMessage(maxFrameHeightPixels) {
logger.log('sending a ReceiverVideoConstraint message with '
+ `a maxFrameHeight of ${maxFrameHeightPixels} pixels`);
logger.log(`Sending ReceiverVideoConstraint with maxFrameHeight=${maxFrameHeightPixels}px`);
this._send({

@@ -301,5 +296,3 @@ colibriClass: 'ReceiverVideoConstraint',

GlobalOnErrorHandler.callErrorHandler(error);
logger.error(
'Failed to parse channel message as JSON: ',
data, error);
logger.error('Failed to parse channel message as JSON: ', data, error);

@@ -316,8 +309,4 @@ return;

logger.info(
'Channel new dominant speaker event: ',
dominantSpeakerEndpoint);
emitter.emit(
RTCEvents.DOMINANT_SPEAKER_CHANGED,
dominantSpeakerEndpoint);
logger.info(`New dominant speaker: ${dominantSpeakerEndpoint}.`);
emitter.emit(RTCEvents.DOMINANT_SPEAKER_CHANGED, dominantSpeakerEndpoint);
break;

@@ -329,7 +318,4 @@ }

logger.info(
`Endpoint connection status changed: ${endpoint} active ? ${
isActive}`);
emitter.emit(RTCEvents.ENDPOINT_CONN_STATUS_CHANGED,
endpoint, isActive);
logger.info(`Endpoint connection status changed: ${endpoint} active=${isActive}`);
emitter.emit(RTCEvents.ENDPOINT_CONN_STATUS_CHANGED, endpoint, isActive);

@@ -339,5 +325,3 @@ break;

case 'EndpointMessage': {
emitter.emit(
RTCEvents.ENDPOINT_MESSAGE_RECEIVED, obj.from,
obj.msgPayload);
emitter.emit(RTCEvents.ENDPOINT_MESSAGE_RECEIVED, obj.from, obj.msgPayload);

@@ -347,17 +331,17 @@ break;

case 'LastNEndpointsChangeEvent': {
// The new/latest list of last-n endpoint IDs.
// The new/latest list of last-n endpoint IDs (i.e. endpoints for which the bridge is sending video).
const lastNEndpoints = obj.lastNEndpoints;
logger.info('Channel new last-n event: ',
lastNEndpoints, obj);
emitter.emit(RTCEvents.LASTN_ENDPOINT_CHANGED,
lastNEndpoints, obj);
logger.info(`New forwarded endpoints: ${lastNEndpoints}`);
emitter.emit(RTCEvents.LASTN_ENDPOINT_CHANGED, lastNEndpoints);
break;
}
case 'SelectedUpdateEvent': {
const isSelected = obj.isSelected;
case 'SenderVideoConstraints': {
const videoConstraints = obj.videoConstraints;
logger.info(`SelectedUpdateEvent isSelected? ${isSelected}`);
emitter.emit(RTCEvents.IS_SELECTED_CHANGED, isSelected);
if (videoConstraints) {
logger.info(`SenderVideoConstraints: ${JSON.stringify(videoConstraints)}`);
this._senderVideoConstraintsChanged(videoConstraints);
}
break;

@@ -364,0 +348,0 @@ }

/* global __filename, Promise */
import { getLogger } from 'jitsi-meet-logger';
import JitsiTrack from './JitsiTrack';
import JitsiTrackError from '../../JitsiTrackError';

@@ -15,4 +15,2 @@ import {

} from '../../JitsiTrackEvents';
import browser from '../browser';
import RTCUtils from './RTCUtils';
import CameraFacingMode from '../../service/RTC/CameraFacingMode';

@@ -27,4 +25,8 @@ import * as MediaType from '../../service/RTC/MediaType';

} from '../../service/statistics/AnalyticsEvents';
import browser from '../browser';
import Statistics from '../statistics/statistics';
import JitsiTrack from './JitsiTrack';
import RTCUtils from './RTCUtils';
const logger = getLogger(__filename);

@@ -97,2 +99,3 @@

this.resolution = track.getSettings().height;
this.maxEnabledResolution = resolution;

@@ -102,2 +105,10 @@ // Cache the constraints of the track in case of any this track

this._constraints = track.getConstraints();
// Safari returns an empty constraints object, construct the constraints using getSettings.
if (!Object.keys(this._constraints).length && videoType === VideoType.CAMERA) {
this._constraints = {
height: track.getSettings().height,
width: track.getSettings().width
};
}
} else {

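The Safari workaround above can be read as: when `getConstraints()` comes back empty for a camera track, synthesize width/height constraints from `getSettings()`. A standalone sketch of that decision (the helper name and parameters are illustrative):

```javascript
// Falls back to the track settings when the browser (Safari) returns
// an empty constraints object for a camera track.
function effectiveConstraints(constraints, settings, isCameraTrack) {
    if (Object.keys(constraints).length === 0 && isCameraTrack) {
        return { height: settings.height, width: settings.width };
    }
    return constraints;
}
```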
@@ -108,2 +119,3 @@ // FIXME Currently, Firefox is ignoring our constraints about

this.resolution = browser.isFirefox() ? null : resolution;
this.maxEnabledResolution = this.resolution;
}

@@ -334,2 +346,3 @@

this._setStream(this._streamEffect.startEffect(this._originalStream));
this.track = this.stream.getTracks()[0];
}

@@ -347,2 +360,4 @@

this._setStream(this._originalStream);
this._originalStream = null;
this.track = this.stream.getTracks()[0];
}

@@ -385,3 +400,5 @@ }

if (this.isMuted()) {
// In case we have an audio track that is being enhanced with an effect, we still want it to be applied,
// even if the track is muted. Whereas for video, the actual track doesn't exist if it's muted.
if (this.isMuted() && !this.isAudioTrack()) {
this._streamEffect = effect;

@@ -402,7 +419,9 @@

// For firefox/safari, replace the stream without doing a offer answer with the remote peer.
if (browser.supportsRtpSender()) {
if (browser.usesUnifiedPlan()) {
this._switchStreamEffect(effect);
if (this.isVideoTrack()) {
this.containers.forEach(cont => RTCUtils.attachMediaStream(cont, this.stream));
}
return conference.replaceTrackWithoutOfferAnswer(this)
return conference.replaceTrack(this, this)
.then(() => {

@@ -551,3 +570,3 @@ this._setEffectInProgress = false;

promise.then(streamsInfo => {
promise = promise.then(streamsInfo => {
// The track kind for presenter track is video as well.

@@ -554,0 +573,0 @@ const mediaType = this.getType() === MediaType.PRESENTER ? MediaType.VIDEO : this.getType();

@@ -0,7 +1,9 @@

import * as JitsiTrackEvents from '../../JitsiTrackEvents';
import { createTtfmEvent } from '../../service/statistics/AnalyticsEvents';
import JitsiTrack from './JitsiTrack';
import * as JitsiTrackEvents from '../../JitsiTrackEvents';
import Statistics from '../statistics/statistics';
import JitsiTrack from './JitsiTrack';
const logger = require('jitsi-meet-logger').getLogger(__filename);
const RTCEvents = require('../../service/RTC/RTCEvents');

@@ -8,0 +10,0 @@

/* global __filename, module */
import EventEmitter from 'events';
import { getLogger } from 'jitsi-meet-logger';
import * as JitsiTrackEvents from '../../JitsiTrackEvents';
import * as MediaType from '../../service/RTC/MediaType';
import browser from '../browser';
import RTCUtils from './RTCUtils';

@@ -20,11 +23,2 @@

/**
* Adds onended/oninactive handler to a MediaStream.
* @param mediaStream a MediaStream to attach onended/oninactive handler
* @param handler the handler
*/
function addMediaStreamInactiveHandler(mediaStream, handler) {
mediaStream.oninactive = handler;
}
/**
* Represents a single media track (either audio or video).

@@ -95,2 +89,16 @@ */

/**
* Adds onended/oninactive handler to a MediaStream or a MediaStreamTrack.
* Firefox doesn't fire an inactive event on the MediaStream; instead it fires
* an onended event on the MediaStreamTrack.
* @param {Function} handler the handler
*/
_addMediaStreamInactiveHandler(handler) {
if (browser.isFirefox()) {
this.track.onended = handler;
} else {
this.stream.oninactive = handler;
}
}
/**
* Sets handler to the WebRTC MediaStream or MediaStreamTrack object

@@ -139,3 +147,3 @@ * depending on the passed type.

if (this._streamInactiveHandler) {
addMediaStreamInactiveHandler(this.stream, undefined);
this._addMediaStreamInactiveHandler(undefined);
}

@@ -167,4 +175,3 @@ }

if (this._streamInactiveHandler) {
addMediaStreamInactiveHandler(
this.stream, this._streamInactiveHandler);
this._addMediaStreamInactiveHandler(this._streamInactiveHandler);
}

@@ -424,8 +431,28 @@ }

setAudioLevel(audioLevel, tpc) {
if (this.audioLevel !== audioLevel) {
this.audioLevel = audioLevel;
let newAudioLevel = audioLevel;
// When using getSynchronizationSources on the audio receiver to gather audio levels for
// remote tracks, the browser reports the last known audio levels even when the remote user is
// audio muted, so we need to reset the value to zero here to clear the audio levels.
// Remote tracks have the tpc info present while local tracks do not.
if (browser.supportsReceiverStats() && typeof tpc !== 'undefined' && this.isMuted()) {
newAudioLevel = 0;
}
if (this.audioLevel !== newAudioLevel) {
this.audioLevel = newAudioLevel;
this.emit(
JitsiTrackEvents.TRACK_AUDIO_LEVEL_CHANGED,
audioLevel,
newAudioLevel,
tpc);
// LocalStatsCollector reports a value of 0.008 for muted mics
// and a value of 0 when there is no audio input.
} else if (this.audioLevel === 0
&& newAudioLevel === 0
&& this.isLocal()
&& !this.isWebRTCTrackMuted()) {
this.emit(
JitsiTrackEvents.NO_AUDIO_INPUT,
newAudioLevel);
}
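The decision in `setAudioLevel` above can be sketched in isolation. The helper below is hypothetical (not part of lib-jitsi-meet); it only restates the branch logic: a remote track (one with `tpc` info) that is muted reports 0, because `getSynchronizationSources` keeps returning the last known level.

```javascript
// Hypothetical helper restating the setAudioLevel branch above.
// A muted remote track (tpc present) is forced to 0 when receiver stats
// are the source of audio levels; everything else passes through.
function computeReportedAudioLevel({ audioLevel, hasTpc, isMuted, supportsReceiverStats }) {
    if (supportsReceiverStats && hasTpc && isMuted) {
        return 0;
    }

    return audioLevel;
}
```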

@@ -432,0 +459,0 @@ }

/* global __filename */
import { getLogger } from 'jitsi-meet-logger';
import * as MediaType from '../../service/RTC/MediaType';

@@ -5,0 +6,0 @@ import { SdpTransformWrap } from '../xmpp/SdpTransformUtil';

@@ -5,16 +5,18 @@ /* global __filename */

import BridgeChannel from './BridgeChannel';
import * as JitsiConferenceEvents from '../../JitsiConferenceEvents';
import * as MediaType from '../../service/RTC/MediaType';
import RTCEvents from '../../service/RTC/RTCEvents';
import VideoType from '../../service/RTC/VideoType';
import browser from '../browser';
import Statistics from '../statistics/statistics';
import GlobalOnErrorHandler from '../util/GlobalOnErrorHandler';
import * as JitsiConferenceEvents from '../../JitsiConferenceEvents';
import JitsiLocalTrack from './JitsiLocalTrack';
import Listenable from '../util/Listenable';
import { safeCounterIncrement } from '../util/MathUtil';
import * as MediaType from '../../service/RTC/MediaType';
import browser from '../browser';
import RTCEvents from '../../service/RTC/RTCEvents';
import BridgeChannel from './BridgeChannel';
import JitsiLocalTrack from './JitsiLocalTrack';
import RTCUtils from './RTCUtils';
import Statistics from '../statistics/statistics';
import TraceablePeerConnection from './TraceablePeerConnection';
import VideoType from '../../service/RTC/VideoType';
const logger = getLogger(__filename);

@@ -141,10 +143,2 @@

// A flag indicating whether we have received confirmation that the channel has opened. This flag can get
// out of sync if for some reason the channel gets closed by the server; that is desired behaviour, so
// that we can see errors when this happens.
// @private
// @type {boolean}
this._channelOpen = false;
/**

@@ -168,2 +162,7 @@ * The value specified to the last invocation of setLastN before the

/*
* Holds the sender video constraints signaled from the bridge.
*/
this._senderVideoConstraints = {};
/**

@@ -290,8 +289,5 @@ * The number representing the maximum video height the local client

this._channel = new BridgeChannel(
peerconnection, wsUrl, this.eventEmitter);
peerconnection, wsUrl, this.eventEmitter, this._senderVideoConstraintsChanged.bind(this));
this._channelOpenListener = () => {
// Mark that channel as opened.
this._channelOpen = true;
// When the channel becomes available, tell the bridge about

@@ -354,2 +350,14 @@ // video selections so that it can do adaptive simulcast,

/**
* Notifies this instance that the sender video constraints signaled from the bridge have changed.
*
* @param {Object} senderVideoConstraints the sender video constraints from the bridge.
* @private
*/
_senderVideoConstraintsChanged(senderVideoConstraints) {
logger.info(`Received remote max frame height of ${JSON.stringify(senderVideoConstraints)} on the bridge channel`);
this._senderVideoConstraints = senderVideoConstraints;
this.eventEmitter.emit(RTCEvents.SENDER_VIDEO_CONSTRAINTS_CHANGED);
}
/**
* Receives events when Last N had changed.

@@ -395,3 +403,2 @@ * @param {array} lastNEndpoints The new Last N endpoints.

this._channel = null;
this._channelOpen = false;
}

@@ -412,3 +419,3 @@ }

if (this._channel && this._channelOpen) {
if (this._channel && this._channel.isOpen()) {
this._channel.sendReceiverVideoConstraintMessage(maxFrameHeight);

@@ -432,3 +439,3 @@ }

if (this._channel && this._channelOpen) {
if (this._channel && this._channel.isOpen()) {
this._channel.sendSelectedEndpointsMessage(ids);

@@ -449,3 +456,3 @@ }

this._pinnedEndpoint = id;
if (this._channel && this._channelOpen) {
if (this._channel && this._channel.isOpen()) {
this._channel.sendPinnedEndpointMessage(id);
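The `this._channel && this._channel.isOpen()` guard above recurs for every bridge-channel message (receiver constraints, selected endpoints, pinned endpoint, lastN). A minimal sketch of the pattern, as a hypothetical standalone helper:

```javascript
// Hypothetical helper: forward a message to the bridge channel only when the
// channel exists and reports itself open. Returns whether the send happened.
function sendIfOpen(channel, send) {
    if (channel && channel.isOpen()) {
        send(channel);

        return true;
    }

    return false;
}
```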

@@ -495,2 +502,4 @@ }

* @param {object} options The config options.
* @param {boolean} options.enableInsertableStreams - Set to true when the insertable streams constraints is to be
* enabled on the PeerConnection.
* @param {boolean} options.disableSimulcast If set to 'true' will disable

@@ -518,2 +527,10 @@ * the simulcast.

// FIXME: We should rename iceConfig to pcConfig.
if (options.enableInsertableStreams) {
logger.debug('E2EE - setting insertable streams constraints');
iceConfig.encodedInsertableStreams = true;
iceConfig.forceEncodedAudioInsertableStreams = true; // legacy, to be removed in M88.
iceConfig.forceEncodedVideoInsertableStreams = true; // legacy, to be removed in M88.
}
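The E2EE flags above extend the peer connection config in place. A sketch of the same shaping as a pure function, assuming only the option and flag names visible in the hunk:

```javascript
// Sketch (not the library's actual API): copy the ICE/PC config and, when
// insertable streams are requested, set the current flag plus the two legacy
// flags noted above as slated for removal in M88.
function buildPcConfig(iceConfig, options) {
    const config = { ...iceConfig };

    if (options.enableInsertableStreams) {
        config.encodedInsertableStreams = true;
        config.forceEncodedAudioInsertableStreams = true; // legacy
        config.forceEncodedVideoInsertableStreams = true; // legacy
    }

    return config;
}
```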
if (browser.supportsSdpSemantics()) {

@@ -590,2 +607,9 @@ iceConfig.sdpSemantics = 'plan-b';

/**
* @return {Object} The sender video constraints signaled from the bridge.
*/
getSenderVideoConstraints() {
return this._senderVideoConstraints;
}
/**
* Get local video track.

@@ -876,3 +900,3 @@ * @returns {JitsiLocalTrack|undefined}

this._channel.close();
this._channelOpen = false;
this._channel = null;

@@ -936,3 +960,3 @@ this.removeListener(RTCEvents.LASTN_ENDPOINT_CHANGED,

this._lastN = value;
if (this._channel && this._channelOpen) {
if (this._channel && this._channel.isOpen()) {
this._channel.sendSetLastNMessage(value);

@@ -939,0 +963,0 @@ }

@@ -9,18 +9,21 @@ /* global

import { AVAILABLE_DEVICE } from '../../service/statistics/AnalyticsEvents';
import CameraFacingMode from '../../service/RTC/CameraFacingMode';
import EventEmitter from 'events';
import { getLogger } from 'jitsi-meet-logger';
import GlobalOnErrorHandler from '../util/GlobalOnErrorHandler';
import clonedeep from 'lodash.clonedeep';
import JitsiTrackError from '../../JitsiTrackError';
import Listenable from '../util/Listenable';
import CameraFacingMode from '../../service/RTC/CameraFacingMode';
import * as MediaType from '../../service/RTC/MediaType';
import RTCEvents from '../../service/RTC/RTCEvents';
import Resolutions from '../../service/RTC/Resolutions';
import VideoType from '../../service/RTC/VideoType';
import { AVAILABLE_DEVICE } from '../../service/statistics/AnalyticsEvents';
import browser from '../browser';
import RTCEvents from '../../service/RTC/RTCEvents';
import screenObtainer from './ScreenObtainer';
import Statistics from '../statistics/statistics';
import GlobalOnErrorHandler from '../util/GlobalOnErrorHandler';
import Listenable from '../util/Listenable';
import SDPUtil from '../xmpp/SDPUtil';
import Statistics from '../statistics/statistics';
import VideoType from '../../service/RTC/VideoType';
import screenObtainer from './ScreenObtainer';
const logger = getLogger(__filename);

@@ -59,3 +62,2 @@

video: {
aspectRatio: 16 / 9,
height: {

@@ -103,2 +105,9 @@ ideal: 720,

/**
* An empty function.
*/
function emptyFuncton() {
// no-op
}
/**
* Initialize wrapper function for enumerating devices.

@@ -113,3 +122,9 @@ * TODO: remove this, it should no longer be needed.

navigator.mediaDevices.enumerateDevices()
.then(callback, () => callback([]));
.then(devices => {
updateKnownDevices(devices);
callback(devices);
}, () => {
updateKnownDevices([]);
callback([]);
});
};

@@ -169,2 +184,4 @@ }

* @param {Object} options.frameRate.max - Maximum fps
* @param {bool} options.screenShareAudio - Used by electron clients to
* enable system audio screen sharing.
*/

@@ -186,3 +203,3 @@ function getConstraints(um, options = {}) {

= browser.isFirefox()
|| browser.isSafariWithVP8()
|| browser.isSafari()
|| browser.isReactNative();

@@ -317,2 +334,19 @@

};
// Audio screen sharing for electron only works for screen type devices.
// i.e. when the user shares the whole desktop.
if (browser.isElectron() && options.screenShareAudio
&& (options.desktopStream.indexOf('screen') >= 0)) {
// Provide constraints as described by the electron desktop capturer
// documentation here:
// https://www.electronjs.org/docs/api/desktop-capturer
// Note. The documentation specifies that chromeMediaSourceId should not be present
// which, in the case a users has multiple monitors, leads to them being shared all
// at once. However we tested with chromeMediaSourceId present and it seems to be
// working properly and also takes care of the previously mentioned issue.
constraints.audio = { mandatory: {
chromeMediaSource: constraints.video.mandatory.chromeMediaSource
} };
}
}
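The Electron branch above only attaches system audio when the whole desktop (a `screen` type source) is shared, reusing the video source id. A hypothetical standalone version of that decision (the real code mutates `constraints` inside `getConstraints`):

```javascript
// Sketch of the Electron screen-share audio decision above. Audio is added
// only for whole-screen shares, pairing the desktop capturer's audio with
// the same chromeMediaSource as the video.
function maybeAddElectronScreenAudio(constraints, { isElectron, screenShareAudio, desktopStream }) {
    if (isElectron && screenShareAudio && desktopStream.indexOf('screen') >= 0) {
        constraints.audio = {
            mandatory: {
                chromeMediaSource: constraints.video.mandatory.chromeMediaSource
            }
        };
    }

    return constraints;
}
```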

@@ -368,4 +402,3 @@

// the passed in constraints object.
const constraints = JSON.parse(JSON.stringify(
options.constraints || DEFAULT_CONSTRAINTS));
const constraints = clonedeep(options.constraints || DEFAULT_CONSTRAINTS);

@@ -377,2 +410,18 @@ if (um.indexOf('video') >= 0) {

// Override the constraints on Safari because of the following webkit bug.
// https://bugs.webkit.org/show_bug.cgi?id=210932
// Camera doesn't start on older macOS versions if min/max constraints are specified.
// TODO: remove this hack when the bug fix is available on Mojave, Sierra and High Sierra.
if (browser.isSafari()) {
if (constraints.video.height && constraints.video.height.ideal) {
constraints.video.height = { ideal: clonedeep(constraints.video.height.ideal) };
} else {
logger.warn('Ideal camera height missing, camera may not start properly');
}
if (constraints.video.width && constraints.video.width.ideal) {
constraints.video.width = { ideal: clonedeep(constraints.video.width.ideal) };
} else {
logger.warn('Ideal camera width missing, camera may not start properly');
}
}
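The Safari workaround above strips min/max bounds and keeps only the `ideal` values so the camera starts on the affected macOS WebKit builds. A hypothetical helper capturing the same transformation:

```javascript
// Sketch of the Safari constraint sanitizing above: for height/width, keep
// only { ideal } and drop min/max; other keys pass through untouched.
function toIdealOnly(videoConstraints) {
    const sanitized = { ...videoConstraints };

    for (const key of [ 'height', 'width' ]) {
        if (sanitized[key] && sanitized[key].ideal) {
            sanitized[key] = { ideal: sanitized[key].ideal };
        }
    }

    return sanitized;
}
```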
if (options.cameraDeviceId) {

@@ -394,23 +443,32 @@ constraints.video.deviceId = options.cameraDeviceId;

// NOTE(brian): the new-style ('advanced' instead of 'optional')
// doesn't seem to carry through the googXXX constraints
// Changing back to 'optional' here (even with video using
// the 'advanced' style) allows them to be passed through
// but also requires the device id to capture to be set in optional
// as sourceId otherwise the constraints are considered malformed.
if (!constraints.audio.optional) {
constraints.audio.optional = [];
// Use the standard audio constraints on non-chromium browsers.
if (browser.isFirefox() || browser.isSafari()) {
constraints.audio = {
deviceId: options.micDeviceId,
autoGainControl: !disableAGC && !disableAP,
echoCancellation: !disableAEC && !disableAP,
noiseSuppression: !disableNS && !disableAP
};
} else {
// NOTE(brian): the new-style ('advanced' instead of 'optional')
// doesn't seem to carry through the googXXX constraints
// Changing back to 'optional' here (even with video using
// the 'advanced' style) allows them to be passed through
// but also requires the device id to capture to be set in optional
// as sourceId otherwise the constraints are considered malformed.
if (!constraints.audio.optional) {
constraints.audio.optional = [];
}
constraints.audio.optional.push(
{ sourceId: options.micDeviceId },
{ echoCancellation: !disableAEC && !disableAP },
{ googEchoCancellation: !disableAEC && !disableAP },
{ googAutoGainControl: !disableAGC && !disableAP },
{ googNoiseSuppression: !disableNS && !disableAP },
{ googHighpassFilter: !disableHPF && !disableAP },
{ googNoiseSuppression2: !disableNS && !disableAP },
{ googEchoCancellation2: !disableAEC && !disableAP },
{ googAutoGainControl2: !disableAGC && !disableAP }
);
}
constraints.audio.optional.push(
{ sourceId: options.micDeviceId },
{ echoCancellation: !disableAEC && !disableAP },
{ googEchoCancellation: !disableAEC && !disableAP },
{ googAutoGainControl: !disableAGC && !disableAP },
{ googNoiseSuppression: !disableNS && !disableAP },
{ googHighpassFilter: !disableHPF && !disableAP },
{ googNoiseSuppression2: !disableNS && !disableAP },
{ googEchoCancellation2: !disableAEC && !disableAP },
{ googAutoGainControl2: !disableAGC && !disableAP }
);
} else {

@@ -589,3 +647,20 @@ constraints.audio = false;

/**
* Update known devices.
*
* @param {Array<Object>} pds - The new devices.
* @returns {void}
*
* NOTE: Use this function as a shared callback to handle both the devicechange event and the polling implementations.
* This prevents duplication and works around a chrome bug (verified to occur on 68) where devicechange fires twice in
* a row, which can cause async post devicechange processing to collide.
*/
function updateKnownDevices(pds) {
if (compareAvailableMediaDevices(pds)) {
onMediaDevicesListChanged(pds);
}
}
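The shared callback above drops device lists that did not actually change, which absorbs Chrome's duplicate `devicechange` events. A simplified sketch of that gating (the real `compareAvailableMediaDevices` compares more fields; here only `deviceId` sets are compared, as an assumption for brevity):

```javascript
// Hypothetical gate: invoke onChanged only when the set of deviceIds differs
// from the last accepted list, so back-to-back identical events are ignored.
function createDeviceListGate(onChanged) {
    let known = [];

    return devices => {
        const ids = devices.map(d => d.deviceId).sort().join(',');
        const knownIds = known.map(d => d.deviceId).sort().join(',');

        if (ids !== knownIds) {
            known = devices;
            onChanged(devices);
        }
    };
}
```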
/**
* Event handler for the 'devicechange' event.

@@ -786,3 +861,3 @@ *

availableDevices = undefined;
availableDevices = [];
window.clearInterval(availableDevicesPollTimer);

@@ -848,3 +923,3 @@ availableDevicesPollTimer = undefined;

this._initPCConstraints(options);
this._initPCConstraints();

@@ -866,18 +941,6 @@ screenObtainer.init(

// Use a shared callback to handle both the devicechange event
// and the polling implementations. This prevents duplication
// and works around a chrome bug (verified to occur on 68) where
// devicechange fires twice in a row, which can cause async post
// devicechange processing to collide.
const updateKnownDevices = () => this.enumerateDevices(pds => {
if (compareAvailableMediaDevices(pds)) {
onMediaDevicesListChanged(pds);
}
});
if (browser.supportsDeviceChangeEvent()) {
navigator.mediaDevices.addEventListener(
'devicechange',
updateKnownDevices);
() => this.enumerateDevices(emptyFuncton));
} else {

@@ -887,3 +950,3 @@ // Periodically poll enumerateDevices() method to check if

availableDevicesPollTimer = window.setInterval(
updateKnownDevices,
() => this.enumerateDevices(emptyFuncton),
AVAILABLE_DEVICES_POLL_INTERVAL_TIME);

@@ -898,13 +961,4 @@ }

* and outside of p2p.
*
* @params {Object} options - Configuration for setting RTCUtil's instance
* objects for peer connection constraints.
* @params {boolean} options.useIPv6 - Set to true if IPv6 should be used.
* @params {Object} options.testing - Additional configuration for work in
* development.
* @params {Object} options.testing.forceP2PSuspendVideoRatio - True if
* video should become suspended if bandwidth estimation becomes low while
* in peer to peer connection mode.
*/
_initPCConstraints(options) {
_initPCConstraints() {
if (browser.isFirefox()) {

@@ -923,7 +977,2 @@ this.pcConstraints = {};

if (options.useIPv6) {
// https://code.google.com/p/webrtc/issues/detail?id=2828
this.pcConstraints.optional.push({ googIPv6: true });
}
this.p2pPcConstraints

@@ -950,2 +999,4 @@ = JSON.parse(JSON.stringify(this.pcConstraints));

* @param {Object} options.frameRate.max - Maximum fps
* @param {bool} options.screenShareAudio - Used by electron clients to
* enable system audio screen sharing.
* @returns {Promise} Returns a media stream on success or a JitsiTrackError

@@ -961,13 +1012,13 @@ * on failure.

navigator.mediaDevices.getUserMedia(constraints)
.then(stream => {
logger.log('onUserMediaSuccess');
updateGrantedPermissions(um, stream);
resolve(stream);
})
.catch(error => {
logger.warn('Failed to get access to local media. '
+ ` ${error} ${constraints} `);
updateGrantedPermissions(um, undefined);
reject(new JitsiTrackError(error, constraints, um));
});
.then(stream => {
logger.log('onUserMediaSuccess');
updateGrantedPermissions(um, stream);
resolve(stream);
})
.catch(error => {
logger.warn('Failed to get access to local media. '
+ ` ${error} ${constraints} `);
updateGrantedPermissions(um, undefined);
reject(new JitsiTrackError(error, constraints, um));
});
});

@@ -1007,3 +1058,2 @@ }

* @param {Object} options
* @param {Object} options.desktopSharingExtensionExternalInstallation
* @param {string[]} options.desktopSharingSources

@@ -1179,3 +1229,2 @@ * @param {Object} options.desktopSharingFrameRate

return {
...options.desktopSharingExtensionExternalInstallation,
desktopSharingSources: options.desktopSharingSources,

@@ -1232,3 +1281,2 @@ gumOptions: {

const {
desktopSharingExtensionExternalInstallation,
desktopSharingSourceDevice,

@@ -1290,3 +1338,2 @@ desktopSharingSources,

return this._newGetDesktopMedia({
desktopSharingExtensionExternalInstallation,
desktopSharingSources,

@@ -1313,9 +1360,28 @@ desktopSharingFrameRate

mediaStreamsMetaData.push({
stream,
sourceId,
sourceType,
track: stream.getVideoTracks()[0],
videoType: VideoType.DESKTOP
});
const desktopAudioTracks = stream.getAudioTracks();
if (desktopAudioTracks.length) {
const desktopAudioStream = new MediaStream(desktopAudioTracks);
mediaStreamsMetaData.push({
stream: desktopAudioStream,
sourceId,
sourceType,
track: desktopAudioStream.getAudioTracks()[0]
});
}
const desktopVideoTracks = stream.getVideoTracks();
if (desktopVideoTracks.length) {
const desktopVideoStream = new MediaStream(desktopVideoTracks);
mediaStreamsMetaData.push({
stream: desktopVideoStream,
sourceId,
sourceType,
track: desktopVideoStream.getVideoTracks()[0],
videoType: VideoType.DESKTOP
});
}
};
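The change above splits one captured desktop stream into separate audio and video entries instead of pushing a single combined one. A sketch of the branching, using plain arrays and objects (hypothetical stand-ins for `MediaStream`/tracks so it runs outside a browser):

```javascript
// Sketch of the track-splitting above: each non-empty track kind becomes its
// own metadata entry; only the video entry carries a desktop videoType.
function splitDesktopTracks({ audioTracks, videoTracks, sourceId, sourceType }) {
    const entries = [];

    if (audioTracks.length) {
        entries.push({ sourceId, sourceType, track: audioTracks[0] });
    }
    if (videoTracks.length) {
        entries.push({ sourceId, sourceType, track: videoTracks[0], videoType: 'desktop' });
    }

    return entries;
}
```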

@@ -1425,3 +1491,3 @@

? isAudioOutputDeviceChangeAvailable
: !browser.isSafariWithVP8();
: true;
}

@@ -1428,0 +1494,0 @@

@@ -0,3 +1,4 @@

import browser from '../browser';
import RTCUtils from './RTCUtils';
import browser from '../browser';
import screenObtainer from './ScreenObtainer';

@@ -4,0 +5,0 @@

@@ -1,2 +0,1 @@

/* global chrome, $, alert */

@@ -8,26 +7,6 @@ import JitsiTrackError from '../../JitsiTrackError';

const logger = require('jitsi-meet-logger').getLogger(__filename);
const GlobalOnErrorHandler = require('../util/GlobalOnErrorHandler');
/**
* Indicates whether the Chrome desktop sharing extension is installed.
* @type {boolean}
*/
let chromeExtInstalled = false;
/**
* Indicates whether an update of the Chrome desktop sharing extension is
* required.
* @type {boolean}
*/
let chromeExtUpdateRequired = false;
let gumFunction = null;
/**
* The error message returned by chrome when the extension is not installed.
*/
const CHROME_NO_EXTENSION_ERROR_MSG // eslint-disable-line no-unused-vars
= 'Could not establish connection. Receiving end does not exist.';
/**
* Handles obtaining a stream from a screen capture on different browsers.

@@ -42,3 +21,2 @@ */

*/
intChromeExtPromise: null,

@@ -52,16 +30,9 @@ obtainStream: null,

* @param {object} options
* @param {boolean} [options.desktopSharingChromeDisabled]
* @param {boolean} [options.desktopSharingChromeExtId]
* @param {boolean} [options.desktopSharingFirefoxDisabled]
* @param {Function} gum GUM method
*/
init(options = {
desktopSharingChromeDisabled: false,
desktopSharingChromeExtId: null,
desktopSharingFirefoxDisabled: false
}, gum) {
init(options = {}, gum) {
this.options = options;
gumFunction = gum;
this.obtainStream = this._createObtainStreamMethod(options);
this.obtainStream = this._createObtainStreamMethod();

@@ -77,8 +48,6 @@ if (!this.obtainStream) {

*
* @param {object} options passed from {@link init} - check description
* there
* @returns {Function}
* @private
*/
_createObtainStreamMethod(options) {
_createObtainStreamMethod() {
if (browser.isNWJS()) {

@@ -109,3 +78,3 @@ return (_, onSuccess, onFailure) => {

jitsiError = new JitsiTrackError(
JitsiTrackErrors.CHROME_EXTENSION_USER_CANCELED
JitsiTrackErrors.SCREENSHARING_USER_CANCELED
);

@@ -122,36 +91,7 @@ } else {

return this.obtainScreenOnElectron;
} else if (browser.isChrome() || browser.isOpera()) {
if (browser.supportsGetDisplayMedia()
&& !options.desktopSharingChromeDisabled) {
return this.obtainScreenFromGetDisplayMedia;
} else if (options.desktopSharingChromeDisabled
|| !options.desktopSharingChromeExtId) {
return null;
}
logger.info('Using Chrome extension for desktop sharing');
this.intChromeExtPromise
= initChromeExtension(options).then(() => {
this.intChromeExtPromise = null;
});
return this.obtainScreenFromExtension;
} else if (browser.isFirefox()) {
if (options.desktopSharingFirefoxDisabled) {
return null;
} else if (browser.supportsGetDisplayMedia()) {
// Firefox 66 supports getDisplayMedia
return this.obtainScreenFromGetDisplayMedia;
}
// Legacy Firefox
return this.obtainScreenOnFirefox;
} else if (browser.supportsGetDisplayMedia()) {
return this.obtainScreenFromGetDisplayMedia;
}
logger.log('Screen sharing not supported on ', browser.getName());
logger.log(
'Screen sharing not supported by the current browser: ',
browser.getName());
return null;

@@ -170,11 +110,2 @@ },

/**
* Obtains a screen capture stream on Firefox.
* @param callback
* @param errorCallback
*/
obtainScreenOnFirefox(options, callback, errorCallback) {
obtainWebRTCScreen(options.gumOptions, callback, errorCallback);
},
/**
* Obtains a screen capture stream on Electron.

@@ -196,6 +127,5 @@ *

{
desktopSharingSources: desktopSharingSources
|| this.options.desktopSharingChromeSources
desktopSharingSources: desktopSharingSources || [ 'screen', 'window' ]
},
(streamId, streamType) =>
(streamId, streamType, screenShareAudio = false) =>
onGetStreamResponse(

@@ -205,3 +135,4 @@ {

streamId,
streamType
streamType,
screenShareAudio
},

@@ -225,84 +156,2 @@ gumOptions

/**
* Asks Chrome extension to call chooseDesktopMedia and gets chrome
* 'desktop' stream for returned stream token.
*/
obtainScreenFromExtension(options, streamCallback, failCallback) {
if (this.intChromeExtPromise !== null) {
this.intChromeExtPromise.then(() => {
this.obtainScreenFromExtension(
options, streamCallback, failCallback);
});
return;
}
const {
desktopSharingChromeExtId,
desktopSharingChromeSources
} = this.options;
const {
gumOptions
} = options;
const doGetStreamFromExtensionOptions = {
desktopSharingChromeExtId,
desktopSharingChromeSources:
options.desktopSharingSources || desktopSharingChromeSources,
gumOptions
};
if (chromeExtInstalled) {
doGetStreamFromExtension(
doGetStreamFromExtensionOptions,
streamCallback,
failCallback);
} else {
if (chromeExtUpdateRequired) {
/* eslint-disable no-alert */
alert(
'Jitsi Desktop Streamer requires update. '
+ 'Changes will take effect after next Chrome restart.');
/* eslint-enable no-alert */
}
this.handleExternalInstall(options, streamCallback,
failCallback);
}
},
/* eslint-disable max-params */
handleExternalInstall(options, streamCallback, failCallback, e) {
const webStoreInstallUrl = getWebStoreInstallUrl(this.options);
options.listener('waitingForExtension', webStoreInstallUrl);
this.checkForChromeExtensionOnInterval(options, streamCallback,
failCallback, e);
},
/* eslint-enable max-params */
checkForChromeExtensionOnInterval(options, streamCallback, failCallback) {
if (options.checkAgain() === false) {
failCallback(new JitsiTrackError(
JitsiTrackErrors.CHROME_EXTENSION_INSTALLATION_ERROR));
return;
}
waitForExtensionAfterInstall(this.options, options.interval, 1)
.then(() => {
chromeExtInstalled = true;
options.listener('extensionFound');
this.obtainScreenFromExtension(options,
streamCallback, failCallback);
})
.catch(() => {
this.checkForChromeExtensionOnInterval(options,
streamCallback, failCallback);
});
},
/**
* Obtains a screen capture stream using getDisplayMedia.

@@ -325,3 +174,7 @@ *

getDisplayMedia({ video: true })
getDisplayMedia({
video: true,
audio: true,
cursor: 'always'
})
.then(stream => {

@@ -333,4 +186,8 @@ let applyConstraintsPromise;

&& stream.getTracks().length > 0) {
applyConstraintsPromise = stream.getTracks()[0]
.applyConstraints(options.trackOptions);
const videoTrack = stream.getVideoTracks()[0];
// Apply video track constraint.
if (videoTrack) {
applyConstraintsPromise = videoTrack.applyConstraints(options.trackOptions);
}
} else {

@@ -346,235 +203,25 @@ applyConstraintsPromise = Promise.resolve();

})
.catch(() =>
errorCallback(new JitsiTrackError(JitsiTrackErrors
.CHROME_EXTENSION_USER_CANCELED)));
}
};
.catch(error => {
const errorDetails = {
errorName: error && error.name,
errorMsg: error && error.message,
errorStack: error && error.stack
};

logger.error('getDisplayMedia error', errorDetails);

if (errorDetails.errorMsg && errorDetails.errorMsg.indexOf('denied by system') !== -1) {
// On Chrome this is the only thing different between error returned when user cancels
// and when no permission was given on the OS level.
errorCallback(new JitsiTrackError(JitsiTrackErrors.PERMISSION_DENIED));

return;
}

errorCallback(new JitsiTrackError(JitsiTrackErrors.SCREENSHARING_USER_CANCELED));
});
}
};

/**
 * Obtains a desktop stream using getUserMedia.
 * For this to work on Chrome, the
 * 'chrome://flags/#enable-usermedia-screen-capture' flag must be enabled.
 *
 * On firefox, the document's domain must be white-listed in the
 * 'media.getusermedia.screensharing.allowed_domains' preference in
 * 'about:config'.
 */
function obtainWebRTCScreen(options, streamCallback, failCallback) {
gumFunction([ 'screen' ], options)
.then(stream => streamCallback({ stream }), failCallback);
}

/**
 * Constructs inline install URL for Chrome desktop streaming extension.
 * The 'chromeExtensionId' must be defined in options parameter.
 * @param options supports "desktopSharingChromeExtId"
 * @returns {string}
 */
function getWebStoreInstallUrl(options) {
return (
`https://chrome.google.com/webstore/detail/${
options.desktopSharingChromeExtId}`);
}

/**
 * Checks whether an update of the Chrome extension is required.
 * @param minVersion minimal required version
 * @param extVersion current extension version
 * @returns {boolean}
 */
function isUpdateRequired(minVersion, extVersion) {
try {
const s1 = minVersion.split('.');
const s2 = extVersion.split('.');

const len = Math.max(s1.length, s2.length);

for (let i = 0; i < len; i++) {
let n1 = 0,
n2 = 0;

if (i < s1.length) {
n1 = parseInt(s1[i], 10);
}
if (i < s2.length) {
n2 = parseInt(s2[i], 10);
}

if (isNaN(n1) || isNaN(n2)) {
return true;
} else if (n1 !== n2) {
return n1 > n2;
}
}

// will happen if both versions have identical numbers in
// their components (even if one of them is longer, has more components)
return false;
} catch (e) {
GlobalOnErrorHandler.callErrorHandler(e);
logger.error('Failed to parse extension version', e);

return true;
}
}
/**
*
* @param callback
* @param options
*/
function checkChromeExtInstalled(callback, options) {
if (typeof chrome === 'undefined' || !chrome || !chrome.runtime) {
// No API, so no extension for sure
callback(false, false);
return;
}
chrome.runtime.sendMessage(
options.desktopSharingChromeExtId,
{ getVersion: true },
response => {
if (!response || !response.version) {
// Communication failure - assume that no endpoint exists
logger.warn(
'Extension not installed?: ', chrome.runtime.lastError);
callback(false, false);
return;
}
// Check installed extension version
const extVersion = response.version;
logger.log(`Extension version is: ${extVersion}`);
const updateRequired
= isUpdateRequired(
options.desktopSharingChromeMinExtVersion,
extVersion);
callback(!updateRequired, updateRequired);
}
);
}
/**
*
* @param options
* @param streamCallback
* @param failCallback
*/
function doGetStreamFromExtension(options, streamCallback, failCallback) {
const {
desktopSharingChromeSources,
desktopSharingChromeExtId,
gumOptions
} = options;
// Sends 'getStream' msg to the extension.
// Extension id must be defined in the config.
chrome.runtime.sendMessage(
desktopSharingChromeExtId,
{
getStream: true,
sources: desktopSharingChromeSources
},
response => {
if (!response) {
// possibly re-wrapping the error message to make the code consistent
const lastError = chrome.runtime.lastError;
failCallback(lastError instanceof Error
? lastError
: new JitsiTrackError(
JitsiTrackErrors.CHROME_EXTENSION_GENERIC_ERROR,
lastError));
return;
}
logger.log('Response from extension: ', response);
onGetStreamResponse(
{
response,
gumOptions
},
streamCallback,
failCallback
);
}
);
}
/**
* Initializes <link rel=chrome-webstore-item /> with extension id set in
* config.js to support inline installs. Host site must be selected as main
* website of published extension.
* @param options supports "desktopSharingChromeExtId"
*/
function initInlineInstalls(options) {
if ($('link[rel=chrome-webstore-item]').length === 0) {
$('head').append('<link rel="chrome-webstore-item">');
}
$('link[rel=chrome-webstore-item]').attr('href',
getWebStoreInstallUrl(options));
}
/**
*
* @param options
*
* @return {Promise} - a Promise resolved once the initialization process is
* finished.
*/
function initChromeExtension(options) {
// Initialize Chrome extension inline installs
initInlineInstalls(options);
return new Promise(resolve => {
// Check if extension is installed
checkChromeExtInstalled((installed, updateRequired) => {
chromeExtInstalled = installed;
chromeExtUpdateRequired = updateRequired;
logger.info(
`Chrome extension installed: ${
chromeExtInstalled} updateRequired: ${
chromeExtUpdateRequired}`);
resolve();
}, options);
});
}
/**
* Checks "retries" times on every "waitInterval"ms whether the ext is alive.
 * @param {Object} options the options passed to ScreenObtainer.obtainStream
* @param {int} waitInterval the number of ms between retries
* @param {int} retries the number of retries
* @returns {Promise} returns promise that will be resolved when the extension
* is alive and rejected if the extension is not alive even after "retries"
* checks
*/
function waitForExtensionAfterInstall(options, waitInterval, retries) {
if (retries === 0) {
return Promise.reject();
}
return new Promise((resolve, reject) => {
let currentRetries = retries;
const interval = window.setInterval(() => {
checkChromeExtInstalled(installed => {
if (installed) {
window.clearInterval(interval);
resolve();
} else {
currentRetries--;
if (currentRetries === 0) {
reject();
window.clearInterval(interval);
}
}
}, options);
}, waitInterval);
});
}
/**
* Handles response from external application / extension and calls GUM to

@@ -586,2 +233,4 @@ * receive the desktop streams or reports error.

* stream.
* @param {bool} options.response.screenShareAudio - Used by electron clients to
* enable system audio screen sharing.
* @param {string} options.response.error - error to be reported.

@@ -600,3 +249,3 @@ * @param {object} options.gumOptions - options passed to GUM.

onFailure) {
const { streamId, streamType, error } = options.response || {};
const { streamId, streamType, screenShareAudio, error } = options.response || {};

@@ -606,2 +255,3 @@ if (streamId) {

desktopStream: streamId,
screenShareAudio,
...options.gumOptions

@@ -622,3 +272,3 @@ };

onFailure(new JitsiTrackError(
JitsiTrackErrors.CHROME_EXTENSION_USER_CANCELED));
JitsiTrackErrors.SCREENSHARING_USER_CANCELED));

@@ -629,3 +279,3 @@ return;

onFailure(new JitsiTrackError(
JitsiTrackErrors.CHROME_EXTENSION_GENERIC_ERROR,
JitsiTrackErrors.SCREENSHARING_GENERIC_ERROR,
error));

@@ -632,0 +282,0 @@ }

@@ -0,2 +1,4 @@

import { jitsiLocalStorage } from '@jitsi/js-utils';
import { getLogger } from 'jitsi-meet-logger';
const logger = getLogger(__filename);

@@ -14,3 +16,19 @@

export default {
/**
* The storage used to store the settings.
*/
_storage: jitsiLocalStorage,
/**
* Initializes the Settings class.
*
* @param {Storage|undefined} externalStorage - Object that implements the Storage interface. This object will be
* used for storing data instead of jitsiLocalStorage if specified.
*/
init(externalStorage) {
this._storage = externalStorage || jitsiLocalStorage;
},
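Anything with the `Storage` shape can be handed to `init`. For instance, a hypothetical in-memory stand-in (handy for tests or environments without `localStorage`; not shipped with the library):

```javascript
// Minimal in-memory object implementing the parts of the Storage
// interface that Settings uses (getItem/setItem/removeItem).
// Hypothetical example for illustration only.
const memoryStorage = {
    _data: new Map(),
    getItem(key) {
        return this._data.has(key) ? this._data.get(key) : null;
    },
    setItem(key, value) {
        this._data.set(key, String(value));
    },
    removeItem(key) {
        this._data.delete(key);
    }
};

// Settings.init(memoryStorage); // would route all reads/writes here
```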
/**
* Returns fake username for callstats

@@ -21,14 +39,6 @@ * @returns {string} fake username for callstats

if (!_callStatsUserName) {
const localStorage = getLocalStorage();
if (localStorage) {
_callStatsUserName = localStorage.getItem('callStatsUserName');
}
_callStatsUserName = this._storage.getItem('callStatsUserName');
if (!_callStatsUserName) {
_callStatsUserName = generateCallStatsUserName();
if (localStorage) {
localStorage.setItem(
'callStatsUserName',
_callStatsUserName);
}
this._storage.setItem('callStatsUserName', _callStatsUserName);
}

@@ -46,12 +56,6 @@ }

if (!_machineId) {
const localStorage = getLocalStorage();
if (localStorage) {
_machineId = localStorage.getItem('jitsiMeetId');
}
_machineId = this._storage.getItem('jitsiMeetId');
if (!_machineId) {
_machineId = generateJitsiMeetId();
if (localStorage) {
localStorage.setItem('jitsiMeetId', _machineId);
}
this._storage.setItem('jitsiMeetId', _machineId);
}

@@ -70,5 +74,3 @@ }

// instance and that's why we should always re-read it.
const localStorage = getLocalStorage();
return localStorage ? localStorage.getItem('sessionId') : undefined;
return this._storage.getItem('sessionId');
},

@@ -81,10 +83,6 @@

set sessionId(sessionId) {
const localStorage = getLocalStorage();
if (localStorage) {
if (sessionId) {
localStorage.setItem('sessionId', sessionId);
} else {
localStorage.removeItem('sessionId');
}
if (sessionId) {
this._storage.setItem('sessionId', sessionId);
} else {
this._storage.removeItem('sessionId');
}

@@ -119,21 +117,2 @@ }

/**
* Gets the localStorage of the browser. (Technically, gets the localStorage of
* the global object because there may be no browser but React Native for
* example).
* @returns {Storage} the local Storage object (if any)
*/
function getLocalStorage() {
let storage;
try {
// eslint-disable-next-line no-invalid-this
storage = (window || this).localStorage;
} catch (error) {
logger.error(error);
}
return storage;
}
/**
*

@@ -140,0 +119,0 @@ */

@@ -0,1 +1,3 @@

import { getLogger } from 'jitsi-meet-logger';
import {

@@ -7,3 +9,2 @@ TYPE_OPERATIONAL,

} from '../../service/statistics/AnalyticsEvents';
import { getLogger } from 'jitsi-meet-logger';
import browser from '../browser';

@@ -10,0 +11,0 @@

import { getLogger } from 'jitsi-meet-logger';
import * as ConferenceEvents from '../../JitsiConferenceEvents';
import * as MediaType from '../../service/RTC/MediaType';
import * as ConnectionQualityEvents from '../../service/connectivity/ConnectionQualityEvents';
import * as MediaType from '../../service/RTC/MediaType';
import { createAudioOutputProblemEvent } from '../../service/statistics/AnalyticsEvents';

@@ -7,0 +7,0 @@

/* global __filename */
import { getLogger } from 'jitsi-meet-logger';
import isEqual from 'lodash.isequal';
import * as ConferenceEvents from '../../JitsiConferenceEvents';
import * as MediaType from '../../service/RTC/MediaType';
import * as VideoType from '../../service/RTC/VideoType';
import * as ConnectionQualityEvents
from '../../service/connectivity/ConnectionQualityEvents';
import {

@@ -8,11 +14,7 @@ createRtpStatsEvent,

} from '../../service/statistics/AnalyticsEvents';
import { getLogger } from 'jitsi-meet-logger';
import * as ConnectionQualityEvents
from '../../service/connectivity/ConnectionQualityEvents';
import * as ConferenceEvents from '../../JitsiConferenceEvents';
import * as MediaType from '../../service/RTC/MediaType';
import browser from '../browser';
import Statistics from './statistics';
import * as VideoType from '../../service/RTC/VideoType';
const logger = getLogger(__filename);

@@ -19,0 +21,0 @@

@@ -379,3 +379,3 @@ /* global callstats */

// if there is no tenant, we will just set '/'
configParams.siteID = (match && match[1]) || '/';
configParams.siteID = options.siteID || (match && match[1]) || '/';
}

@@ -382,0 +382,0 @@

@@ -110,5 +110,2 @@ /**

const self = this;
this.intervalId = setInterval(

@@ -121,6 +118,8 @@ () => {

if (audioLevel !== self.audioLevel) {
self.audioLevel = animateLevel(audioLevel, self.audioLevel);
self.callback(self.audioLevel);
}
// Set the audio levels always as NoAudioSignalDetection now
// uses audio levels from LocalStatsCollector and waits for
// at least 4 seconds of no audio signal before displaying the
// notification on the UI.
this.audioLevel = animateLevel(audioLevel, this.audioLevel);
this.callback(this.audioLevel);
},

@@ -127,0 +126,0 @@ this.intervalMilis

@@ -1,10 +0,12 @@

import browser from '../browser';
import { browsers } from '@jitsi/js-utils';
import { getLogger } from 'jitsi-meet-logger';
import * as MediaType from '../../service/RTC/MediaType';
import * as StatisticsEvents from '../../service/statistics/Events';
import * as MediaType from '../../service/RTC/MediaType';
import browser from '../browser';
const GlobalOnErrorHandler = require('../util/GlobalOnErrorHandler');
const logger = require('jitsi-meet-logger').getLogger(__filename);
const logger = getLogger(__filename);
/**

@@ -24,5 +26,5 @@ * The lib-jitsi-meet browser-agnostic names of the browser-specific keys

'framerateMean': 'framerateMean',
'ip': 'ipAddress',
'port': 'portNumber',
'protocol': 'transport'
'ip': 'address',
'port': 'port',
'protocol': 'protocol'
};
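The change above moves Firefox from the legacy `ipAddress`/`portNumber`/`transport` keys to the spec-compliant `address`/`port`/`protocol` names of RTCIceCandidateStats. A rough sketch of how such a browser-keyed table is consumed (simplified, hypothetical `getStatValue`, not the collector's actual implementation):

```javascript
// Illustrative key table; the real one lives in KEYS_BY_BROWSER_TYPE.
const STAT_KEYS = {
    firefox: {
        'ip': 'address', // spec-compliant RTCIceCandidateStats fields
        'port': 'port',
        'protocol': 'protocol'
    }
};

// Translates a browser-agnostic key into the browser-specific stat field.
function getStatValue(stats, browserType, genericKey) {
    const keys = STAT_KEYS[browserType];
    if (!keys || !(genericKey in keys)) {
        throw new Error(`unsupported key: ${genericKey}`);
    }
    return stats[keys[genericKey]];
}
```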

@@ -42,2 +44,3 @@ KEYS_BY_BROWSER_TYPE[browsers.CHROME] = {

'bytesSent': 'bytesSent',
'googCodecName': 'googCodecName',
'googFrameHeightReceived': 'googFrameHeightReceived',

@@ -97,2 +100,3 @@ 'googFrameWidthReceived': 'googFrameWidthReceived',

this.framerate = 0;
this.codec = '';
}

@@ -143,2 +147,6 @@

SsrcStats.prototype.setCodec = function(codec) {
this.codec = codec || '';
};
/**

@@ -222,3 +230,3 @@ *

this._usesPromiseGetStats
= browser.isSafariWithWebrtc() || browser.isFirefox();
= browser.isSafari() || browser.isFirefox();

@@ -292,32 +300,52 @@ /**

StatsCollector.prototype.start = function(startAudioLevelStats) {
const self = this;
if (startAudioLevelStats) {
if (browser.supportsReceiverStats()) {
logger.info('Using RTCRtpSynchronizationSource for remote audio levels');
}
this.audioLevelsIntervalId = setInterval(
() => {
// Interval updates
self.peerconnection.getStats(
report => {
let results = null;
if (browser.supportsReceiverStats()) {
const audioLevels = this.peerconnection.getAudioLevels();
if (!report || !report.result
|| typeof report.result !== 'function') {
results = report;
} else {
results = report.result();
for (const ssrc in audioLevels) {
if (audioLevels.hasOwnProperty(ssrc)) {
// Use a scaling factor of 2.5 to report the same
// audio levels that getStats reports.
const audioLevel = audioLevels[ssrc] * 2.5;
this.eventEmitter.emit(
StatisticsEvents.AUDIO_LEVEL,
this.peerconnection,
Number.parseInt(ssrc, 10),
audioLevel,
false /* isLocal */);
}
self.currentAudioLevelsReport = results;
if (this._usesPromiseGetStats) {
self.processNewAudioLevelReport();
} else {
self.processAudioLevelReport();
}
}
} else {
// Interval updates
this.peerconnection.getStats(
report => {
let results = null;
self.baselineAudioLevelsReport
= self.currentAudioLevelsReport;
},
error => self.errorCallback(error)
);
if (!report || !report.result
|| typeof report.result !== 'function') {
results = report;
} else {
results = report.result();
}
this.currentAudioLevelsReport = results;
if (this._usesPromiseGetStats) {
this.processNewAudioLevelReport();
} else {
this.processAudioLevelReport();
}
this.baselineAudioLevelsReport
= this.currentAudioLevelsReport;
},
error => this.errorCallback(error)
);
}
},
self.audioLevelsIntervalMilis
this.audioLevelsIntervalMilis
);

@@ -329,3 +357,3 @@ }

// Interval updates
self.peerconnection.getStats(
this.peerconnection.getStats(
report => {

@@ -343,8 +371,8 @@ let results = null;

self.currentStatsReport = results;
this.currentStatsReport = results;
try {
if (this._usesPromiseGetStats) {
self.processNewStatsReport();
this.processNewStatsReport();
} else {
self.processStatsReport();
this.processStatsReport();
}

@@ -356,8 +384,8 @@ } catch (e) {

self.previousStatsReport = self.currentStatsReport;
this.previousStatsReport = this.currentStatsReport;
},
error => self.errorCallback(error)
error => this.errorCallback(error)
);
},
self.statsIntervalMilis
this.statsIntervalMilis
);

@@ -705,4 +733,14 @@ };

}
let codec;
// Try to get the codec for later reporting.
try {
codec = getStatValue(now, 'googCodecName') || '';
} catch (e) { /* not supported*/ }
ssrcStats.setCodec(codec);
}
this.eventEmitter.emit(

@@ -731,6 +769,9 @@ StatisticsEvents.BYTE_SENT_STATS, this.peerconnection, byteSentStats);

const framerates = {};
const codecs = {};
let audioBitrateDownload = 0;
let audioBitrateUpload = 0;
let audioCodec = '';
let videoBitrateDownload = 0;
let videoBitrateUpload = 0;
let videoCodec = '';

@@ -756,5 +797,7 @@ for (const [ ssrc, ssrcStats ] of this.ssrc2stats) {

audioBitrateUpload += ssrcStats.bitrate.upload;
audioCodec = ssrcStats.codec;
} else {
videoBitrateDownload += ssrcStats.bitrate.download;
videoBitrateUpload += ssrcStats.bitrate.upload;
videoCodec = ssrcStats.codec;
}

@@ -782,2 +825,13 @@

}
if (audioCodec.length && videoCodec.length) {
const codecDesc = {
'audio': audioCodec,
'video': videoCodec
};
const userCodecs = codecs[participantId] || {};
userCodecs[ssrc] = codecDesc;
codecs[participantId] = userCodecs;
}
} else {

@@ -849,2 +903,3 @@ logger.error(`No participant ID returned by ${track}`);

'framerate': framerates,
'codec': codecs,
'transport': this.conferenceStats.transport,
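The codec bookkeeping added in this hunk only records an entry when both the audio and the video codec names are known. A simplified, self-contained sketch of that aggregation (hypothetical `collectCodecs`, not the library's actual function):

```javascript
// Builds { participantId: { ssrc: { audio, video } } }, skipping
// entries where either codec name is still unknown, mirroring the
// `audioCodec.length && videoCodec.length` guard above.
function collectCodecs(entries) {
    const codecs = {};
    for (const { participantId, ssrc, audioCodec, videoCodec } of entries) {
        if (audioCodec.length && videoCodec.length) {
            const userCodecs = codecs[participantId] || {};
            userCodecs[ssrc] = { audio: audioCodec, video: videoCodec };
            codecs[participantId] = userCodecs;
        }
    }
    return codecs;
}
```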

@@ -1073,4 +1128,2 @@ localAvgAudioLevels,

if (remoteUsedCandidate && localUsedCandidate) {
// FF uses non-standard ipAddress, portNumber, transport
// instead of ip, port, protocol
const remoteIpAddress = getStatValue(remoteUsedCandidate, 'ip');

@@ -1226,2 +1279,6 @@ const remotePort = getStatValue(remoteUsedCandidate, 'port');

const ssrc = this.peerconnection.getSsrcByTrackId(trackIdentifier);
if (!ssrc) {
return;
}
let ssrcStats = this.ssrc2stats.get(ssrc);

@@ -1228,0 +1285,0 @@

import * as JitsiConferenceEvents from '../../JitsiConferenceEvents';
import SpeakerStats from './SpeakerStats';
import XMPPEvents from '../../service/xmpp/XMPPEvents';
import SpeakerStats from './SpeakerStats';
/**

@@ -6,0 +7,0 @@ * A collection for tracking speaker stats. Attaches listeners

import EventEmitter from 'events';
import JitsiConference from '../../JitsiConference';
import * as JitsiConferenceEvents from '../../JitsiConferenceEvents';
import JitsiParticipant from '../../JitsiParticipant';
import SpeakerStats from './SpeakerStats';

@@ -6,0 +8,0 @@ import SpeakerStatsCollector from './SpeakerStatsCollector';

import EventEmitter from 'events';
import * as JitsiConferenceEvents from '../../JitsiConferenceEvents';
import JitsiTrackError from '../../JitsiTrackError';
import { FEEDBACK } from '../../service/statistics/AnalyticsEvents';
import * as StatisticsEvents from '../../service/statistics/Events';
import browser from '../browser';
import ScriptUtil from '../util/ScriptUtil';
import analytics from './AnalyticsAdapter';
import CallStats from './CallStats';
import LocalStats from './LocalStatsCollector';
import { PerformanceObserverStats } from './PerformanceObserverStats';
import RTPStats from './RTPStatsCollector';
import { CALLSTATS_SCRIPT_URL } from './constants';
import browser from '../browser';
import ScriptUtil from '../util/ScriptUtil';
import JitsiTrackError from '../../JitsiTrackError';
import * as StatisticsEvents from '../../service/statistics/Events';
const logger = require('jitsi-meet-logger').getLogger(__filename);

@@ -42,4 +45,3 @@

ScriptUtil.loadScript(
options.customScriptUrl
|| 'https://api.callstats.io/static/callstats-ws.min.js',
options.customScriptUrl || CALLSTATS_SCRIPT_URL,
/* async */ true,

@@ -73,3 +75,4 @@ /* prepend */ true,

getWiFiStatsMethod: options.getWiFiStatsMethod,
confID: options.confID
confID: options.confID,
siteID: options.siteID
})) {

@@ -123,2 +126,6 @@ logger.error('CallStats Backend initialization failed bad');

if (typeof options.longTasksStatsInterval === 'number') {
Statistics.longTasksStatsInterval = options.longTasksStatsInterval;
}
Statistics.disableThirdPartyRequests = options.disableThirdPartyRequests;

@@ -134,4 +141,2 @@ };

* @property {string} userName - The user name to use when initializing callstats.
* @property {string} callStatsConfIDNamespace - A namespace to prepend the
* callstats conference ID with.
* @property {string} confID - The callstats conference ID to use.

@@ -162,3 +167,3 @@ * @property {string} callStatsID - Callstats credentials - the id.

this.callStatsIntegrationEnabled
= this.options.callStatsID && this.options.callStatsSecret
= this.options.callStatsID && this.options.callStatsSecret && this.options.enableCallStats

@@ -181,6 +186,2 @@ // Even though AppID and AppSecret may be specified, the integration

}
if (!this.options.callStatsConfIDNamespace) {
logger.warn('"callStatsConfIDNamespace" is not defined');
}
}

@@ -296,2 +297,59 @@

/**
* Add a listener that would be notified on a LONG_TASKS_STATS event.
*
* @param {Function} listener a function that would be called when notified.
* @returns {void}
*/
Statistics.prototype.addLongTasksStatsListener = function(listener) {
this.eventEmitter.on(StatisticsEvents.LONG_TASKS_STATS, listener);
};
/**
* Creates an instance of {@link PerformanceObserverStats} and starts the
* observer that records the stats periodically.
*
* @returns {void}
*/
Statistics.prototype.attachLongTasksStats = function(conference) {
if (!browser.supportsPerformanceObserver()) {
logger.warn('Performance observer for long tasks not supported by browser!');
return;
}
this.performanceObserverStats = new PerformanceObserverStats(
this.eventEmitter,
Statistics.longTasksStatsInterval);
conference.on(
JitsiConferenceEvents.CONFERENCE_JOINED,
() => this.performanceObserverStats.startObserver());
conference.on(
JitsiConferenceEvents.CONFERENCE_LEFT,
() => this.performanceObserverStats.stopObserver());
};
/**
* Obtains the current value of the LongTasks event statistics.
*
* @returns {Object|null} stats object if the observer has been
* created, null otherwise.
*/
Statistics.prototype.getLongTasksStats = function() {
return this.performanceObserverStats
? this.performanceObserverStats.getLongTasksStats()
: null;
};
/**
* Removes the given listener for the LONG_TASKS_STATS event.
*
* @param {Function} listener the listener we want to remove.
* @returns {void}
*/
Statistics.prototype.removeLongTasksStatsListener = function(listener) {
this.eventEmitter.removeListener(StatisticsEvents.LONG_TASKS_STATS, listener);
};
Statistics.prototype.dispose = function() {

@@ -383,3 +441,3 @@ try {

{
confID: this._getCallStatsConfID(),
confID: this.options.confID,
remoteUserID

@@ -410,15 +468,2 @@ });

/**
* Constructs the CallStats conference ID based on the options currently
* configured in this instance.
* @return {string}
* @private
*/
Statistics.prototype._getCallStatsConfID = function() {
// The conference ID is case sensitive!!!
return this.options.callStatsConfIDNamespace
? `${this.options.callStatsConfIDNamespace}/${this.options.roomName}`
: this.options.roomName;
};
/**
* Removes the callstats.io instances.

@@ -718,3 +763,3 @@ */

return CallStats.sendFeedback(this._getCallStatsConfID(), overall, comment);
return CallStats.sendFeedback(this.options.confID, overall, comment);
};

@@ -721,0 +766,0 @@

/* global config */
const TranscriptionService = require('./AbstractTranscriptionService');
const Word = require('../word');
const audioRecorder = require('./../audioRecorder');
const TranscriptionService = require('./AbstractTranscriptionService');

@@ -7,0 +8,0 @@ /**

/**

@@ -31,3 +30,22 @@ * The method will increase the given number by 1. If the given counter is equal

/**
* Calculates a 32-bit hash for a given string, similar to Java's
* String.hashCode() implementation.
*
* @param {String} string - String whose hash has to be calculated.
* @returns {number} - The computed hash code.
*/
export function hashString(string) {
let hash = 0;
for (let i = 0; i < string.length; i++) {
hash += Math.pow(string.charCodeAt(i) * 31, string.length - i);
/* eslint-disable no-bitwise */
hash = hash & hash; // Convert to 32bit integer
}
return Math.abs(hash);
}
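The function folds the accumulator back to 32 bits on every step (`hash & hash`) and returns a non-negative value, so equal strings always map to the same id. The function is repeated here so the usage snippet runs standalone:

```javascript
// Copy of the exported hashString above, for a self-contained example.
function hashString(string) {
    let hash = 0;
    for (let i = 0; i < string.length; i++) {
        hash += Math.pow(string.charCodeAt(i) * 31, string.length - i);
        /* eslint-disable no-bitwise */
        hash = hash & hash; // fold to a 32-bit integer
    }
    return Math.abs(hash);
}

// Same input always yields the same non-negative 32-bit value.
const id = hashString('conference-room-1');
```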
/**

@@ -42,1 +60,37 @@ * Returns only the positive values from an array of numbers.

}
/**
* This class calculates a simple running average that continually changes
* as more data points are collected and added.
*/
export class RunningAverage {
/**
* Creates an instance of the running average calculator.
*/
constructor() {
this.average = 0;
this.n = 0;
}
/**
* Adds a new data point to the existing set of values and recomputes
* the running average.
* @param {number} value
* @returns {void}
*/
addNext(value) {
if (typeof value !== 'number') {
return;
}
this.n += 1;
this.average = this.average + ((value - this.average) / this.n);
}
/**
* Obtains the average value for the current subset of values.
* @returns {number} - computed average.
*/
getAverage() {
return this.average;
}
}
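Usage of the running average is straightforward; the class body is repeated below so the example is self-contained:

```javascript
// Copy of the RunningAverage class above, for a runnable example.
class RunningAverage {
    constructor() {
        this.average = 0;
        this.n = 0;
    }
    addNext(value) {
        if (typeof value !== 'number') {
            return;
        }
        this.n += 1;
        this.average = this.average + ((value - this.average) / this.n);
    }
    getAverage() {
        return this.average;
    }
}

const avg = new RunningAverage();
avg.addNext(2);
avg.addNext(4);
avg.getAverage(); // → 3, the incremental form of (2 + 4) / 2
avg.addNext('not a number'); // ignored, average unchanged
```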
import { getLogger } from 'jitsi-meet-logger';
const logger = getLogger(__filename);
import XMPPEvents from '../../service/xmpp/XMPPEvents';
import JitsiVideoSIPGWSession from './JitsiVideoSIPGWSession';
import * as Constants from './VideoSIPGWConstants';
import XMPPEvents from '../../service/xmpp/XMPPEvents';

@@ -8,0 +9,0 @@ /**

@@ -174,2 +174,4 @@ /* global $ */

}
return features;
});

@@ -187,4 +189,4 @@ }

*/
getFeaturesAndIdentities(jid, timeout = 5000) {
return this._getDiscoInfo(jid, null, timeout);
getFeaturesAndIdentities(jid, node, timeout = 5000) {
return this._getDiscoInfo(jid, node, timeout);
}

@@ -191,0 +193,0 @@

@@ -6,10 +6,11 @@ /* global $, __filename */

import GlobalOnErrorHandler from '../util/GlobalOnErrorHandler';
import * as JitsiTranscriptionStatus from '../../JitsiTranscriptionStatus';
import Listenable from '../util/Listenable';
import * as MediaType from '../../service/RTC/MediaType';
import XMPPEvents from '../../service/xmpp/XMPPEvents';
import GlobalOnErrorHandler from '../util/GlobalOnErrorHandler';
import Listenable from '../util/Listenable';
import Lobby from './Lobby';
import XmppConnection from './XmppConnection';
import Moderator from './moderator';
import XmppConnection from './XmppConnection';

@@ -85,2 +86,8 @@ const logger = getLogger(__filename);

/**
* Array of affiliations that are allowed in members only room.
* @type {string[]}
*/
const MEMBERS_AFFILIATIONS = [ 'owner', 'admin', 'member' ];
/**
*

@@ -100,4 +107,6 @@ */

* @param {boolean} options.disableFocus - when set to {@code false} will
* not invite Jicofo into the room. This is intended to be used only by
* jitsi-meet-spot.
* not invite Jicofo into the room.
* @param {boolean} options.disableDiscoInfo - when set to {@code false} will skip disco info.
* This is intended to be used only for lobby rooms.
* @param {boolean} options.enableLobby - when set to {@code false} will skip creating lobby room.
*/

@@ -126,2 +135,5 @@ constructor(connection, jid, password, XMPP, options) {

});
if (typeof this.options.enableLobby === 'undefined' || this.options.enableLobby) {
this.lobby = new Lobby(this);
}
this.initPresenceMap(options);

@@ -169,2 +181,4 @@ this.lastPresences = {};

}
this.presenceUpdateTime = Date.now();
}

@@ -174,7 +188,10 @@

* Joins the chat room.
* @param password
* @param {string} password - Password to unlock room on joining.
* @param {Object} customJoinPresenceExtensions - Key/value object to be used
* for the initial presence; each key becomes an XMPP node whose text is the value,
* and those will be added to the initial <x xmlns='http://jabber.org/protocol/muc'/>
* @returns {Promise} - resolved when join completes. At the time of this
* writing it's never rejected.
*/
join(password) {
join(password, customJoinPresenceExtensions) {
this.password = password;

@@ -184,3 +201,3 @@

this.options.disableFocus
&& logger.info('Conference focus disabled');
&& logger.info(`Conference focus disabled for ${this.roomjid}`);

@@ -193,3 +210,3 @@ const preJoin

preJoin.then(() => {
this.sendPresence(true);
this.sendPresence(true, customJoinPresenceExtensions);
this._removeConnListeners.push(

@@ -207,5 +224,6 @@ this.connection.addEventListener(

*
* @param fromJoin
* @param fromJoin - Whether this is initial presence to join the room.
* @param customJoinPresenceExtensions - Object of key values to be added to the initial presence only.
*/
sendPresence(fromJoin) {
sendPresence(fromJoin, customJoinPresenceExtensions) {
const to = this.presMap.to;

@@ -231,2 +249,7 @@

}
if (customJoinPresenceExtensions) {
Object.keys(customJoinPresenceExtensions).forEach(key => {
pres.c(key).t(customJoinPresenceExtensions[key]).up();
});
}
pres.up();

@@ -236,2 +259,6 @@ }

parser.json2packet(this.presMap.nodes, pres);
// we store time we last synced presence state
this.presenceSyncTime = Date.now();
this.connection.send(pres);
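`customJoinPresenceExtensions` is a flat key/value map and each entry becomes one child element of the join presence. A rough illustration of the resulting shape using plain string building (the library itself uses the Strophe builder, not this hypothetical helper):

```javascript
// Hypothetical rendering of customJoinPresenceExtensions into
// presence child elements, e.g. { lobbyrequest: 'true' }.
function renderExtensions(extensions) {
    return Object.keys(extensions)
        .map(key => `<${key}>${extensions[key]}</${key}>`)
        .join('');
}
```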

@@ -306,4 +333,19 @@ if (fromJoin) {

} else {
logger.trace('No meeting ID from backend');
logger.warn('No meeting ID from backend');
}
const membersOnly = $(result).find('>query>feature[var="muc_membersonly"]').length === 1;
const lobbyRoomField
= $(result).find('>query>x[type="result"]>field[var="muc#roominfo_lobbyroom"]>value');
if (this.lobby) {
this.lobby.setLobbyRoomJid(lobbyRoomField && lobbyRoomField.length ? lobbyRoomField.text() : undefined);
}
if (membersOnly !== this.membersOnlyEnabled) {
this.membersOnlyEnabled = membersOnly;
this.eventEmitter.emit(XMPPEvents.MUC_MEMBERS_ONLY_CHANGED, membersOnly);
}
}, error => {

@@ -337,2 +379,6 @@ GlobalOnErrorHandler.callErrorHandler(error);

if (this.options.disableDiscoInfo) {
return;
}
const getForm = $iq({ type: 'get',

@@ -344,4 +390,2 @@ to: this.roomjid })

const self = this;
this.connection.sendIQ(getForm, form => {

@@ -359,3 +403,3 @@ if (!$(form).find(

const formSubmit = $iq({ to: self.roomjid,
const formSubmit = $iq({ to: this.roomjid,
type: 'set' })

@@ -374,3 +418,3 @@ .c('query', { xmlns: 'http://jabber.org/protocol/muc#owner' });

self.connection.sendIQ(formSubmit);
this.connection.sendIQ(formSubmit);

@@ -539,3 +583,6 @@ }, error => {

// but blocked from sending, during the join process.
this.sendPresence();
// send the presence only if there was a modification after we had synced it
if (this.presenceUpdateTime >= this.presenceSyncTime) {
this.sendPresence();
}

@@ -546,3 +593,3 @@ this.eventEmitter.emit(XMPPEvents.MUC_JOINED);

// meeting Id if any
this.discoRoomInfo();
!this.options.disableDiscoInfo && this.discoRoomInfo();
}

@@ -572,3 +619,4 @@ } else if (jid === undefined) {

member.identity,
member.botType);
member.botType,
member.jid);

@@ -590,2 +638,7 @@ // we are reporting the status with the join

// affiliation changed
if (memberOfThis.affiliation !== member.affiliation) {
memberOfThis.affiliation = member.affiliation;
}
// fire event that botType had changed

@@ -731,2 +784,8 @@ if (memberOfThis.botType !== member.botType) {

logger.info(`Ignore focus: ${from}, real JID: ${mucJid}`);
this.xmpp.caps.getFeatures(mucJid, 15000).then(features => {
this.focusFeatures = features;
logger.info(`Jicofo supports restart by terminate: ${this.supportsRestartByTerminate()}`);
}, error => {
logger.error('Failed to discover Jicofo features', error && error.message);
});
}

@@ -743,2 +802,12 @@

/**
* Checks if Jicofo supports restarting Jingle session after 'session-terminate'.
* @returns {boolean}
*/
supportsRestartByTerminate() {
return this.focusFeatures
? this.focusFeatures.has('https://jitsi.org/meet/jicofo/terminate-restart')
: false;
}
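`caps.getFeatures` resolves with a `Set` of feature URNs, so the capability check added above reduces to a `Set.has` lookup. A minimal sketch under that assumption:

```javascript
// Illustrative: discovered Jicofo features arrive as a Set of URN strings.
const focusFeatures = new Set([
    'https://jitsi.org/meet/jicofo/terminate-restart'
]);

// Standalone version of the supportsRestartByTerminate check.
function supportsRestartByTerminate(features) {
    return features
        ? features.has('https://jitsi.org/meet/jicofo/terminate-restart')
        : false;
}
```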
/**
*

@@ -874,4 +943,5 @@ * @param node

// room destroyed ?
if ($(pres).find('>x[xmlns="http://jabber.org/protocol/muc#user"]'
+ '>destroy').length) {
const destroySelect = $(pres).find('>x[xmlns="http://jabber.org/protocol/muc#user"]>destroy');
if (destroySelect.length) {
let reason;

@@ -887,3 +957,3 @@ const reasonSelect

this.eventEmitter.emit(XMPPEvents.MUC_DESTROYED, reason);
this.eventEmitter.emit(XMPPEvents.MUC_DESTROYED, reason, destroySelect.attr('jid'));
this.connection.emuc.doLeave(this.roomjid);

@@ -920,20 +990,13 @@

// if no member is found this is the case we had kicked someone
// and we are not in the list of members
if (membersKeys.find(jid => Strophe.getResourceFromJid(jid) === actorNick)) {
// we first fire the kicked so we can show the participant
// who kicked, before notifying that participant left
// we fire kicked for us and for any participant kicked
this.eventEmitter.emit(
XMPPEvents.KICKED,
isSelfPresence,
actorNick,
Strophe.getResourceFromJid(from));
}
// we first fire the kicked so we can show the participant
// who kicked, before notifying that participant left
// we fire kicked for us and for any participant kicked
this.eventEmitter.emit(
XMPPEvents.KICKED,
isSelfPresence,
actorNick,
Strophe.getResourceFromJid(from));
}
if (!isSelfPresence) {
delete this.members[from];
this.onParticipantLeft(from, false);
} else if (membersKeys.length > 0) {
if (isSelfPresence) {
// If the status code is 110 this means we're leaving and we would

@@ -955,2 +1018,5 @@ // like to remove everyone else from our view, so we trigger the

}
} else {
delete this.members[from];
this.onParticipantLeft(from, false);
}

@@ -1008,17 +1074,35 @@ }

if (from === this.roomjid
&& $(msg).find('>x[xmlns="http://jabber.org/protocol/muc#user"]>status[code="104"]').length) {
this.discoRoomInfo();
if (from === this.roomjid) {
let invite;
if ($(msg).find('>x[xmlns="http://jabber.org/protocol/muc#user"]>status[code="104"]').length) {
this.discoRoomInfo();
} else if ((invite = $(msg).find('>x[xmlns="http://jabber.org/protocol/muc#user"]>invite'))
&& invite.length) {
const passwordSelect = $(msg).find('>x[xmlns="http://jabber.org/protocol/muc#user"]>password');
let password;
if (passwordSelect && passwordSelect.length) {
password = passwordSelect.text();
}
this.eventEmitter.emit(XMPPEvents.INVITE_MESSAGE_RECEIVED,
from, invite.attr('from'), txt, password);
}
}
const jsonMessage = $(msg).find('>json-message').text();
const parsedJson = this.xmpp.tryParseJSONAndVerify(jsonMessage);
// We emit this event if the message is a valid json, and is not
// delivered after a delay, i.e. stamp is undefined.
// e.g. - subtitles should not be displayed if delayed.
if (parsedJson && stamp === undefined) {
this.eventEmitter.emit(XMPPEvents.JSON_MESSAGE_RECEIVED,
from, parsedJson);
if (jsonMessage) {
const parsedJson = this.xmpp.tryParseJSONAndVerify(jsonMessage);
return;
// We emit this event if the message is a valid json, and is not
// delivered after a delay, i.e. stamp is undefined.
// e.g. - subtitles should not be displayed if delayed.
if (parsedJson && stamp === undefined) {
this.eventEmitter.emit(XMPPEvents.JSON_MESSAGE_RECEIVED,
from, parsedJson);
return;
}
}

@@ -1075,2 +1159,17 @@

this.eventEmitter.emit(XMPPEvents.ROOM_MAX_USERS_ERROR);
} else if ($(pres)
.find(
'>error[type="auth"]'
+ '>registration-required['
+ 'xmlns="urn:ietf:params:xml:ns:xmpp-stanzas"]').length) {
// let's extract the lobby jid from the custom field
const lobbyRoomNode = $(pres).find('>lobbyroom');
let lobbyRoomJid;
if (lobbyRoomNode.length) {
lobbyRoomJid = lobbyRoomNode.text();
}
this.eventEmitter.emit(XMPPEvents.ROOM_CONNECT_MEMBERS_ONLY_ERROR, lobbyRoomJid);
} else {

@@ -1085,3 +1184,27 @@ logger.warn('onPresError ', pres);

* @param jid
* @param affiliation
*/
setAffiliation(jid, affiliation) {
const grantIQ = $iq({
to: this.roomjid,
type: 'set'
})
.c('query', { xmlns: 'http://jabber.org/protocol/muc#admin' })
.c('item', {
affiliation,
nick: Strophe.getResourceFromJid(jid)
})
.c('reason').t(`Your affiliation has been changed to '${affiliation}'.`)
.up().up().up();
this.connection.sendIQ(
grantIQ,
result => logger.log('Set affiliation of participant with jid: ', jid, 'to', affiliation, result),
error => logger.log('Set affiliation of participant error: ', error));
}
/**
*
* @param jid
*/
kick(jid) {

@@ -1149,3 +1272,20 @@ const kickIQ = $iq({ to: this.roomjid,

.up();
formsubmit
.c('field',
{ 'var': 'muc#roomconfig_passwordprotectedroom' })
.c('value')
.t(key === null || key.length === 0 ? '0' : '1')
.up()
.up();
// if members only enabled
if (this.membersOnlyEnabled) {
formsubmit
.c('field', { 'var': 'muc#roomconfig_membersonly' })
.c('value')
.t('true')
.up()
.up();
}
// Fixes a bug in prosody 0.9.+

@@ -1160,3 +1300,2 @@ // https://prosody.im/issues/issue/373

// FIXME: is muc#roomconfig_passwordprotectedroom required?
this.connection.sendIQ(formsubmit, onSuccess, onError);

@@ -1173,3 +1312,82 @@ } else {

/**
* Turns off or on the members only config for the main room.
*
* @param {boolean} enabled - Whether to turn it on or off.
* @param onSuccess - optional callback.
* @param onError - optional callback.
*/
setMembersOnly(enabled, onSuccess, onError) {
if (enabled && Object.values(this.members).filter(m => !m.isFocus).length) {
// first grant membership to all that are in the room
// currently there is a bug in prosody where it handles only the first item,
// which is why we send an IQ per member
Object.values(this.members).forEach(m => {
if (m.jid && !MEMBERS_AFFILIATIONS.includes(m.affiliation)) {
this.xmpp.connection.sendIQ(
$iq({
to: this.roomjid,
type: 'set' })
.c('query', {
xmlns: 'http://jabber.org/protocol/muc#admin' })
.c('item', {
'affiliation': 'member',
'jid': m.jid
}).up().up());
}
});
}
const errorCallback = onError ? onError : () => {}; // eslint-disable-line no-empty-function
this.xmpp.connection.sendIQ(
$iq({
to: this.roomjid,
type: 'get'
}).c('query', { xmlns: 'http://jabber.org/protocol/muc#owner' }),
res => {
if ($(res).find('>query>x[xmlns="jabber:x:data"]>field[var="muc#roomconfig_membersonly"]').length) {
const formToSubmit
= $iq({
to: this.roomjid,
type: 'set'
}).c('query', { xmlns: 'http://jabber.org/protocol/muc#owner' });
formToSubmit.c('x', {
xmlns: 'jabber:x:data',
type: 'submit'
});
formToSubmit
.c('field', { 'var': 'FORM_TYPE' })
.c('value')
.t('http://jabber.org/protocol/muc#roomconfig')
.up()
.up();
formToSubmit
.c('field', { 'var': 'muc#roomconfig_membersonly' })
.c('value')
.t(enabled ? 'true' : 'false')
.up()
.up();
// if room is locked from other participant or we are locking it
if (this.locked) {
formToSubmit
.c('field',
{ 'var': 'muc#roomconfig_passwordprotectedroom' })
.c('value')
.t('1')
.up()
.up();
}
this.xmpp.connection.sendIQ(formToSubmit, onSuccess, errorCallback);
} else {
errorCallback(new Error('Setting members only room not supported!'));
}
},
errorCallback);
}
/**
* Adds the key to the presence map, overriding any previous value.
* @param key

@@ -1182,6 +1400,7 @@ * @param values

this.presMap.nodes.push(values);
this.presenceUpdateTime = Date.now();
}
/**
* Retreives a value from the presence map.
* Retrieves a value from the presence map.
*

@@ -1196,3 +1415,3 @@ * @param {string} key - The key to find the value for.

/**
*
* Removes a key from the presence map.
* @param key

@@ -1204,2 +1423,3 @@ */

this.presMap.nodes = nodes;
this.presenceUpdateTime = Date.now();
}

@@ -1309,3 +1529,2 @@

addAudioInfoToPresence(mute) {
this.removeFromPresence('audiomuted');
this.addToPresence(

@@ -1339,3 +1558,2 @@ 'audiomuted',

addVideoInfoToPresence(mute) {
this.removeFromPresence('videomuted');
this.addToPresence(

@@ -1432,2 +1650,10 @@ 'videomuted',

/**
* Returns the lobby for this room.
* @returns {Lobby}
*/
getLobby() {
return this.lobby;
}
/**
* Returns the phone number for joining the conference.

@@ -1504,2 +1730,12 @@ */

/**
* Clean any listeners or resources, executed on leaving.
*/
clean() {
this._removeConnListeners.forEach(remove => remove());
this._removeConnListeners = [];
this.joined = false;
}
/**
* Leaves the room. Closes the jingle session.

@@ -1515,4 +1751,3 @@ * @returns {Promise} which is resolved if XMPPEvents.MUC_LEFT is received

this.clean();

@@ -1519,0 +1754,0 @@ /**

@@ -1,5 +0,7 @@

import { $pres } from 'strophe.js';
import XMPPEvents from '../../service/xmpp/XMPPEvents';
import ChatRoom, { parser } from './ChatRoom';
// This rule makes creating the xml elements take up way more

@@ -175,3 +177,4 @@ // space than necessary.

undefined,
undefined,
'fulljid'
]);

@@ -204,3 +207,4 @@ });

undefined,
undefined,
'jid=attr');
});

@@ -250,3 +254,4 @@

expectedIdentity,
undefined,
'fulljid'
]);

@@ -282,3 +287,4 @@ });

undefined,
expectedBotType,
'fulljid'
]);

@@ -285,0 +291,0 @@ });

/* global __filename */
import { getLogger } from 'jitsi-meet-logger';
import Listenable from '../util/Listenable';
import * as JingleSessionState from './JingleSessionState';

@@ -12,3 +15,3 @@

*/
export default class JingleSession extends Listenable {

@@ -22,3 +25,3 @@ /* eslint-disable max-params */

* @param {string} remoteJid the JID of the remote peer
* @param {XmppConnection} connection the XMPP connection
* @param {Object} mediaConstraints the media constraints object passed to

@@ -39,2 +42,3 @@ * the PeerConnection onCreateAnswer/Offer as defined by the WebRTC.

isInitiator) {
super();
this.sid = sid;

@@ -179,2 +183,4 @@ this.localJid = localJid;

* @param {string} [options.reasonDescription] some meaningful error message
* @param {boolean} [options.requestRestart=false] set to true to ask Jicofo to start a new session once this one is
* terminated.
* @param {boolean} [options.sendSessionTerminate=true] set to false to skip

@@ -181,0 +187,0 @@ * sending session-terminate. It may not make sense to send it if the XMPP

/* global $, Promise */
import { getLogger } from 'jitsi-meet-logger';
import { $iq, Strophe } from 'strophe.js';
import Settings from '../settings/Settings';
const AuthenticationEvents
= require('../../service/authentication/AuthenticationEvents');
const XMPPEvents = require('../../service/xmpp/XMPPEvents');
const GlobalOnErrorHandler = require('../util/GlobalOnErrorHandler');
const logger = getLogger(__filename);

@@ -206,2 +207,9 @@ /**

}
if (config.enableOpusRed === true) {
elem.c(
'property', {
name: 'enableOpusRed',
value: true
}).up();
}
if (config.minParticipants !== undefined) {

@@ -218,3 +226,3 @@ elem.c(

name: 'enableLipSync',
value: this.options.connection.enableLipSync === true
}).up();

@@ -272,2 +280,9 @@ if (config.audioPacketDelay !== undefined) {

if (config.opusMaxAverageBitrate) {
elem.c(
'property', {
name: 'opusMaxAverageBitrate',
value: config.opusMaxAverageBitrate
}).up();
}
if (this.options.conference.startAudioMuted !== undefined) {

@@ -274,0 +289,0 @@ elem.c(

/* global __filename */
import { getLogger } from 'jitsi-meet-logger';
import { parseSecondarySSRC, SdpTransformWrap } from './SdpTransformUtil';
import SDPUtil from './SDPUtil';

@@ -7,0 +8,0 @@ const logger = getLogger(__filename);

/* eslint-disable max-len*/
import RtxModifier from './RtxModifier.js';
import * as transform from 'sdp-transform';
import SDPUtil from './SDPUtil';
import { default as SampleSdpStrings } from './SampleSdpStrings.js';

@@ -79,6 +79,6 @@ /**

describe('RtxModifier', () => {
let rtxModifier;
beforeEach(() => {
rtxModifier = new RtxModifier();
});

@@ -88,14 +88,16 @@

describe('when given an sdp with a single video ssrc', () => {
let primaryVideoSsrc, singleVideoSdp;
beforeEach(() => {
singleVideoSdp = SampleSdpStrings.plainVideoSdp;
primaryVideoSsrc = getPrimaryVideoSsrc(singleVideoSdp);
});
it('should add a single rtx ssrc', () => {
// Call rtxModifier.modifyRtxSsrcs with an sdp that contains a single video
// ssrc. The returned sdp should have an rtx ssrc and an fid group.
const newSdpStr = rtxModifier.modifyRtxSsrcs(transform.write(singleVideoSdp));
const newSdp = transform.parse(newSdpStr);
const newPrimaryVideoSsrc = getPrimaryVideoSsrc(newSdp);
expect(newPrimaryVideoSsrc).toEqual(primaryVideoSsrc);

@@ -113,10 +115,10 @@ // Should now have an rtx ssrc as well

expect(fidGroupPrimarySsrc).toEqual(primaryVideoSsrc);
});
it('should re-use the same rtx ssrc for a primary ssrc it\'s seen before', () => {
// Have rtxModifier generate an rtx ssrc via modifyRtxSsrcs. Then call it again
// with the same primary ssrc in the sdp (but no rtx ssrc). It should use
// the same rtx ssrc as before.
let newSdpStr = rtxModifier.modifyRtxSsrcs(transform.write(singleVideoSdp));
let newSdp = transform.parse(newSdpStr);

@@ -128,3 +130,3 @@

// Now pass the original sdp through again
newSdpStr = rtxModifier.modifyRtxSsrcs(transform.write(singleVideoSdp));
newSdp = transform.parse(newSdpStr);

@@ -137,3 +139,3 @@ fidGroup = getVideoGroups(newSdp, 'FID')[0];

it('should NOT re-use the same rtx ssrc for a primary ssrc it\'s seen before if the cache has been cleared', () => {
// Call modifyRtxSsrcs to generate an rtx ssrc

@@ -143,3 +145,3 @@ // Clear the rtxModifier cache

// --> We should get a different rtx ssrc
let newSdpStr = rtxModifier.modifyRtxSsrcs(transform.write(singleVideoSdp));
let newSdp = transform.parse(newSdpStr);

@@ -150,6 +152,6 @@

rtxModifier.clearSsrcCache();
// Now pass the original sdp through again
newSdpStr = rtxModifier.modifyRtxSsrcs(transform.write(singleVideoSdp));
newSdp = transform.parse(newSdpStr);

@@ -162,3 +164,3 @@ fidGroup = getVideoGroups(newSdp, 'FID')[0];

it('should use the rtx ssrc from the cache when the cache has been manually set', () => {
// Manually set an rtx ssrc mapping in the cache

@@ -170,5 +172,5 @@ // Call modifyRtxSsrcs

ssrcCache.set(primaryVideoSsrc, forcedRtxSsrc);
rtxModifier.setSsrcCache(ssrcCache);
const newSdpStr = rtxModifier.modifyRtxSsrcs(transform.write(singleVideoSdp));
const newSdp = transform.parse(newSdpStr);

@@ -184,18 +186,20 @@

describe('when given an sdp with multiple video ssrcs', () => {
let multipleVideoSdp, primaryVideoSsrcs;
beforeEach(() => {
multipleVideoSdp = SampleSdpStrings.simulcastSdp;
primaryVideoSsrcs = getPrimaryVideoSsrcs(multipleVideoSdp);
});
it('should add rtx ssrcs for all of them', () => {
// Call rtxModifier.modifyRtxSsrcs with an sdp that contains multiple video
// ssrcs. The returned sdp should have an rtx ssrc and an fid group for all of them.
const newSdpStr = rtxModifier.modifyRtxSsrcs(transform.write(multipleVideoSdp));
const newSdp = transform.parse(newSdpStr);
const newPrimaryVideoSsrcs = getPrimaryVideoSsrcs(newSdp);
expect(newPrimaryVideoSsrcs).toEqual(primaryVideoSsrcs);
// Should now have rtx ssrcs as well
expect(numVideoSsrcs(newSdp)).toEqual(primaryVideoSsrcs.length * 2);

@@ -205,15 +209,15 @@ // Should now have FID groups

expect(fidGroups.length).toEqual(primaryVideoSsrcs.length);
fidGroups.forEach(fidGroup => {
const fidGroupPrimarySsrc = SDPUtil.parseGroupSsrcs(fidGroup)[0];
expect(primaryVideoSsrcs.indexOf(fidGroupPrimarySsrc)).not.toEqual(-1);
});
});
it('should re-use the same rtx ssrcs for any primary ssrc it\'s seen before', () => {
// Have rtxModifier generate an rtx ssrc via modifyRtxSsrcs. Then call it again
// with the same primary ssrc in the sdp (but no rtx ssrc). It should use
// the same rtx ssrc as before.
let newSdpStr = rtxModifier.modifyRtxSsrcs(transform.write(multipleVideoSdp));
let newSdp = transform.parse(newSdpStr);

@@ -235,3 +239,3 @@

// Now pass the original sdp through again and make sure we get the same mapping
newSdpStr = rtxModifier.modifyRtxSsrcs(transform.write(multipleVideoSdp));
newSdp = transform.parse(newSdpStr);

@@ -249,3 +253,3 @@ fidGroups = getVideoGroups(newSdp, 'FID');

it('should NOT re-use the same rtx ssrcs for any primary ssrc it\'s seen before if the cache has been cleared', () => {
// Call modifyRtxSsrcs to generate an rtx ssrc

@@ -255,3 +259,3 @@ // Clear the rtxModifier cache

// --> We should get different rtx ssrcs
let newSdpStr = rtxModifier.modifyRtxSsrcs(transform.write(multipleVideoSdp));
let newSdp = transform.parse(newSdpStr);

@@ -272,6 +276,6 @@

rtxModifier.clearSsrcCache();
// Now pass the original sdp through again and make sure we get the same mapping
newSdpStr = rtxModifier.modifyRtxSsrcs(transform.write(multipleVideoSdp));
newSdp = transform.parse(newSdpStr);

@@ -289,3 +293,3 @@ fidGroups = getVideoGroups(newSdp, 'FID');

it('should use the rtx ssrcs from the cache when the cache has been manually set', () => {
// Manually set an rtx ssrc mapping in the cache

@@ -296,8 +300,8 @@ // Call modifyRtxSsrcs

primaryVideoSsrcs.forEach(ssrc => {
rtxMapping.set(ssrc, SDPUtil.generateSsrc());
});
rtxModifier.setSsrcCache(rtxMapping);
const newSdpStr = rtxModifier.modifyRtxSsrcs(transform.write(multipleVideoSdp));
const newSdp = transform.parse(newSdpStr);

@@ -319,5 +323,5 @@

describe('when given an sdp with a flexfec stream', () => {
it('should not add rtx for the flexfec ssrc', () => {
const flexFecSdp = SampleSdpStrings.flexFecSdp;
const newSdpStr = rtxModifier.modifyRtxSsrcs(transform.write(flexFecSdp));
const newSdp = transform.parse(newSdpStr);

@@ -331,3 +335,3 @@ const fidGroups = getVideoGroups(newSdp, 'FID');

describe('(corner cases)', () => {
it('should handle a recvonly video mline', () => {
const sdp = SampleSdpStrings.plainVideoSdp;

@@ -337,8 +341,8 @@ const videoMLine = sdp.media.find(m => m.type === 'video');

videoMLine.direction = 'recvonly';
const newSdpStr = rtxModifier.modifyRtxSsrcs(transform.write(sdp));
expect(newSdpStr).toEqual(transform.write(sdp));
});
it('should handle a video mline with no video ssrcs', () => {
const sdp = SampleSdpStrings.plainVideoSdp;

@@ -348,5 +352,5 @@ const videoMLine = sdp.media.find(m => m.type === 'video');

videoMLine.ssrcs = [];
const newSdpStr = rtxModifier.modifyRtxSsrcs(transform.write(sdp));
expect(newSdpStr).toEqual(transform.write(sdp));
});

@@ -358,5 +362,5 @@ });

beforeEach(() => { }); // eslint-disable-line no-empty-function
it('should strip all rtx streams from an sdp with rtx', () => {
const sdpStr = transform.write(SampleSdpStrings.rtxVideoSdp);
const newSdpStr = rtxModifier.stripRtx(sdpStr);
const newSdp = transform.parse(newSdpStr);

@@ -368,5 +372,5 @@ const fidGroups = getVideoGroups(newSdp, 'FID');

});
it('should do nothing to an sdp with no rtx', () => {
const sdpStr = transform.write(SampleSdpStrings.plainVideoSdp);
const newSdpStr = rtxModifier.stripRtx(sdpStr);

@@ -378,3 +382,2 @@ expect(newSdpStr).toEqual(sdpStr);

/* eslint-enable max-len*/

@@ -73,14 +73,39 @@ /* eslint-disable max-len*/

const multiCodecVideoMLine = ''
+ 'm=video 9 RTP/SAVPF 96 97 98 99 102 121 127 120\r\n'
+ 'c=IN IP4 0.0.0.0\r\n'
+ 'a=rtpmap:96 VP8/90000\r\n'
+ 'a=rtpmap:97 rtx/90000\r\n'
+ 'a=rtpmap:98 VP9/90000\r\n'
+ 'a=rtpmap:99 rtx/90000\r\n'
+ 'a=rtpmap:102 H264/90000\r\n'
+ 'a=rtpmap:121 rtx/90000\r\n'
+ 'a=rtpmap:127 H264/90000\r\n'
+ 'a=rtpmap:120 rtx/90000\r\n'
+ 'a=rtcp:9 IN IP4 0.0.0.0\r\n'
+ 'a=rtcp-fb:96 ccm fir\r\n'
+ 'a=rtcp-fb:96 transport-cc\r\n'
+ 'a=rtcp-fb:96 nack\r\n'
+ 'a=rtcp-fb:96 nack pli\r\n'
+ 'a=rtcp-fb:96 goog-remb\r\n'
+ 'a=rtcp-fb:98 ccm fir\r\n'
+ 'a=rtcp-fb:98 transport-cc\r\n'
+ 'a=rtcp-fb:98 nack\r\n'
+ 'a=rtcp-fb:98 nack pli\r\n'
+ 'a=rtcp-fb:98 goog-remb\r\n'
+ 'a=rtcp-fb:102 ccm fir\r\n'
+ 'a=rtcp-fb:102 transport-cc\r\n'
+ 'a=rtcp-fb:102 nack\r\n'
+ 'a=rtcp-fb:102 nack pli\r\n'
+ 'a=rtcp-fb:102 goog-remb\r\n'
+ 'a=rtcp-fb:127 ccm fir\r\n'
+ 'a=rtcp-fb:127 transport-cc\r\n'
+ 'a=rtcp-fb:127 nack\r\n'
+ 'a=rtcp-fb:127 nack pli\r\n'
+ 'a=rtcp-fb:127 goog-remb\r\n'
+ 'a=fmtp:97 apt=96\r\n'
+ 'a=fmtp:98 profile-id=0\r\n'
+ 'a=fmtp:102 profile-level-id=42e01f;level-asymmetry-allowed=1;packetization-mode=1\r\n'
+ 'a=fmtp:121 apt=102\r\n'
+ 'a=fmtp:127 profile-level-id=42e01f;level-asymmetry-allowed=1;packetization-mode=0\r\n'
+ 'a=fmtp:120 apt=127\r\n'
+ 'a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time\r\n'

@@ -266,9 +291,28 @@ + 'a=setup:passive\r\n'

export default {
get simulcastSdp() {
return transform.parse(simulcastSdpStr);
},
get simulcastRtxSdp() {
return transform.parse(simulcastRtxSdpStr);
},
get plainVideoSdp() {
return transform.parse(plainVideoSdpStr);
},
get rtxVideoSdp() {
return transform.parse(rtxVideoSdpStr);
},
get multiCodecVideoSdp() {
return transform.parse(multiCodecVideoSdpStr);
},
get flexFecSdp() {
return transform.parse(flexFecSdpStr);
}
};
/* eslint-enable max-len*/
/* global $ */
import browser from '../browser';
import SDPUtil from './SDPUtil';

@@ -51,11 +53,8 @@

SDP.prototype.getMediaSsrcMap = function() {
const mediaSSRCs = {};
for (let mediaindex = 0; mediaindex < this.media.length; mediaindex++) {
const mid
= SDPUtil.parseMID(
SDPUtil.findLine(this.media[mediaindex], 'a=mid:'));
const media = {

@@ -69,3 +68,4 @@ mediaindex,

mediaSSRCs[mediaindex] = media;
SDPUtil.findLines(this.media[mediaindex], 'a=ssrc:').forEach(line => {
const linessrc = line.substring(7).split(' ')[0];

@@ -83,4 +83,3 @@

});
SDPUtil.findLines(this.media[mediaindex], 'a=ssrc-group:').forEach(line => {
const idx = line.indexOf(' ');

@@ -124,80 +123,20 @@ const semantics = line.substr(0, idx).substr(13);

// add content's to a jingle element
SDP.prototype.toJingle = function(elem, thecreator) {
// https://xmpp.org/extensions/xep-0338.html
SDPUtil.findLines(this.session, 'a=group:').forEach(line => {
const parts = line.split(' ');
const semantics = parts.shift().substr(8);
elem.c('group', { xmlns: 'urn:xmpp:jingle:apps:grouping:0',
semantics });
for (let j = 0; j < parts.length; j++) {
elem.c('content', { name: parts[j] }).up();
}
elem.up();
});
for (let i = 0; i < this.media.length; i++) {
const mline = SDPUtil.parseMLine(this.media[i].split('\r\n')[0]);
if (!(mline.media === 'audio'

@@ -208,2 +147,4 @@ || mline.media === 'video'

}
let ssrc;
const assrcline = SDPUtil.findLine(this.media[i], 'a=ssrc:');

@@ -228,3 +169,3 @@

if (mline.media === 'audio' || mline.media === 'video') {
elem.c('description',

@@ -236,7 +177,8 @@ { xmlns: 'urn:xmpp:jingle:apps:rtp:1',

}
for (let j = 0; j < mline.fmt.length; j++) {
const rtpmap
= SDPUtil.findLine(
this.media[i],
`a=rtpmap:${mline.fmt[j]}`);
elem.c('payload-type', SDPUtil.parseRTPMap(rtpmap));

@@ -252,7 +194,7 @@

if (afmtpline) {
const fmtpParameters = SDPUtil.parseFmtp(afmtpline);
// eslint-disable-next-line max-depth
for (let k = 0; k < fmtpParameters.length; k++) {
elem.c('parameter', fmtpParameters[k]).up();
}

@@ -266,12 +208,3 @@ }

}
if (ssrc) {

@@ -330,3 +263,3 @@ const ssrcMap = SDPUtil.parseSSRC(this.media[i]);

if (ridLines.length && browser.usesRidsForSimulcast()) {
// Map a line which looks like "a=rid:2 send" to just

@@ -368,35 +301,35 @@ // the rid ("2")

// XEP-0294
const extmapLines = SDPUtil.findLines(this.media[i], 'a=extmap:');
for (let j = 0; j < extmapLines.length; j++) {
const extmap = SDPUtil.parseExtmap(extmapLines[j]);
elem.c('rtp-hdrext', {
xmlns: 'urn:xmpp:jingle:apps:rtp:rtp-hdrext:0',
uri: extmap.uri,
id: extmap.value
});
// eslint-disable-next-line max-depth
if (extmap.hasOwnProperty('direction')) {
// eslint-disable-next-line max-depth
switch (extmap.direction) {
case 'sendonly':
elem.attrs({ senders: 'responder' });
break;
case 'recvonly':
elem.attrs({ senders: 'initiator' });
break;
case 'sendrecv':
elem.attrs({ senders: 'both' });
break;
case 'inactive':
elem.attrs({ senders: 'none' });
break;
}
}
// TODO: handle params
elem.up();
}

@@ -420,3 +353,6 @@ elem.up(); // end of description

}
// Reject an m-line only when port is 0 and a=bundle-only is not present in the section.
// The port is automatically set to 0 when bundle-only is used.
if (mline.port === '0' && !SDPUtil.findLine(m, 'a=bundle-only', this.session)) {
// estos hack to reject an m-line

@@ -433,5 +369,2 @@ elem.attrs({ senders: 'rejected' });

SDP.prototype.transportToJingle = function(mediaindex, elem) {
elem.c('transport');

@@ -441,3 +374,3 @@

const sctpmap
= SDPUtil.findLine(this.media[mediaindex], 'a=sctpmap:', this.session);

@@ -468,26 +401,28 @@ if (sctpmap) {

fingerprints.forEach(line => {
const fingerprint = SDPUtil.parseFingerprint(line);
fingerprint.xmlns = 'urn:xmpp:jingle:apps:dtls:0';
elem.c('fingerprint').t(fingerprint.fingerprint);
delete fingerprint.fingerprint;
const setupLine
= SDPUtil.findLine(
this.media[mediaindex],
'a=setup:',
this.session);
if (setupLine) {
fingerprint.setup = setupLine.substr(8);
}
elem.attrs(fingerprint);
elem.up(); // end of fingerprint
});
const iceParameters = SDPUtil.iceparams(this.media[mediaindex], this.session);
if (iceParameters) {
iceParameters.xmlns = 'urn:xmpp:jingle:transports:ice-udp:1';
elem.attrs(iceParameters);
// XEP-0176
const candidateLines
= SDPUtil.findLines(

@@ -498,22 +433,20 @@ this.media[mediaindex],

candidateLines.forEach(line => { // add any a=candidate lines
const candidate = SDPUtil.candidateToJingle(line);
if (this.failICE) {
candidate.ip = '1.1.1.1';
}
const protocol
= candidate && typeof candidate.protocol === 'string'
? candidate.protocol.toLowerCase()
: '';
if ((this.removeTcpCandidates
&& (protocol === 'tcp' || protocol === 'ssltcp'))
|| (this.removeUdpCandidates && protocol === 'udp')) {
return;
}
elem.c('candidate', candidate).up();
});
}

@@ -531,8 +464,8 @@ elem.up(); // end of transport

lines.forEach(line => {
const feedback = SDPUtil.parseRTCPFB(line);
if (feedback.type === 'trr-int') {
elem.c('rtcp-fb-trr-int', {
xmlns: 'urn:xmpp:jingle:apps:rtp:rtcp-fb:0',
value: feedback.params[0]
});

@@ -543,6 +476,6 @@ elem.up();

xmlns: 'urn:xmpp:jingle:apps:rtp:rtcp-fb:0',
type: feedback.type
});
if (feedback.params.length > 0) {
elem.attrs({ 'subtype': feedback.params[0] });
}

@@ -555,29 +488,28 @@ elem.up();

SDP.prototype.rtcpFbFromJingle = function(elem, payloadtype) { // XEP-0293
let sdp = '';
const feedbackElementTrrInt
= elem.find(
'>rtcp-fb-trr-int[xmlns="urn:xmpp:jingle:apps:rtp:rtcp-fb:0"]');
if (feedbackElementTrrInt.length) {
sdp += 'a=rtcp-fb:* trr-int ';
if (feedbackElementTrrInt.attr('value')) {
sdp += feedbackElementTrrInt.attr('value');
} else {
sdp += '0';
}
sdp += '\r\n';
}
const feedbackElements = elem.find('>rtcp-fb[xmlns="urn:xmpp:jingle:apps:rtp:rtcp-fb:0"]');
feedbackElements.each((_, fb) => {
sdp += `a=rtcp-fb:${payloadtype} ${fb.getAttribute('type')}`;
if (fb.hasAttribute('subtype')) {
sdp += ` ${fb.getAttribute('subtype')}`;
}
sdp += '\r\n';
});
return sdp;
};

@@ -587,6 +519,7 @@

SDP.prototype.fromJingle = function(jingle) {
const sessionId = Date.now();
// Use a unique session id for every TPC.
this.raw = 'v=0\r\n'
+ `o=- ${sessionId} 2 IN IP4 0.0.0.0\r\n`
+ 's=-\r\n'

@@ -609,3 +542,3 @@ + 't=0 0\r\n';

if (contents.length > 0) {
this.raw
+= `a=group:${

@@ -620,7 +553,6 @@ group.getAttribute('semantics')

this.session = this.raw;
jingle.find('>content').each((_, content) => {
const m = this.jingle2media($(content));
this.media.push(m);
});

@@ -641,25 +573,24 @@

SDP.prototype.jingle2media = function(content) {
const desc = content.find('>description');
const transport = content.find('>transport[xmlns="urn:xmpp:jingle:transports:ice-udp:1"]');
let sdp = '';
const sctp = transport.find(
'>sctpmap[xmlns="urn:xmpp:jingle:transports:dtls-sctp:1"]');
const media = { media: desc.attr('media') };
media.port = '1';
if (content.attr('senders') === 'rejected') {
// estos hack to reject an m-line.
media.port = '0';
}
if (transport.find('>fingerprint[xmlns="urn:xmpp:jingle:apps:dtls:0"]').length) {
media.proto = sctp.length ? 'DTLS/SCTP' : 'RTP/SAVPF';
} else {
media.proto = 'RTP/AVPF';
}
if (sctp.length) {
sdp += `m=application ${media.port} DTLS/SCTP ${
sctp.attr('number')}\r\n`;
sdp += `a=sctpmap:${sctp.attr('number')} ${sctp.attr('protocol')}`;

@@ -669,60 +600,72 @@ const streamCount = sctp.attr('streams');

if (streamCount) {
sdp += ` ${streamCount}\r\n`;
} else {
sdp += '\r\n';
}
} else {
media.fmt
= desc
.find('>payload-type')
.map((_, payloadType) => payloadType.getAttribute('id'))
.get();
sdp += `${SDPUtil.buildMLine(media)}\r\n`;
}
sdp += 'c=IN IP4 0.0.0.0\r\n';
if (!sctp.length) {
sdp += 'a=rtcp:1 IN IP4 0.0.0.0\r\n';
}
// XEP-0176 ICE parameters
if (transport.length) {
if (transport.attr('ufrag')) {
sdp += `${SDPUtil.buildICEUfrag(transport.attr('ufrag'))}\r\n`;
}
if (transport.attr('pwd')) {
sdp += `${SDPUtil.buildICEPwd(transport.attr('pwd'))}\r\n`;
}
transport.find('>fingerprint[xmlns="urn:xmpp:jingle:apps:dtls:0"]').each((_, fingerprint) => {
sdp += `a=fingerprint:${fingerprint.getAttribute('hash')}`;
sdp += ` ${$(fingerprint).text()}`;
sdp += '\r\n';
if (fingerprint.hasAttribute('setup')) {
sdp += `a=setup:${fingerprint.getAttribute('setup')}\r\n`;
}
});
// XEP-0176 ICE candidates
transport.find('>candidate')
.each((_, candidate) => {
let protocol = candidate.getAttribute('protocol');
protocol
= typeof protocol === 'string' ? protocol.toLowerCase() : '';
if ((this.removeTcpCandidates
&& (protocol === 'tcp' || protocol === 'ssltcp'))
|| (this.removeUdpCandidates && protocol === 'udp')) {
return;
} else if (this.failICE) {
candidate.setAttribute('ip', '1.1.1.1');
}
sdp += SDPUtil.candidateFromJingle(candidate);
});
}
switch (content.attr('senders')) {
case 'initiator':
media += 'a=sendonly\r\n';
sdp += 'a=sendonly\r\n';
break;
case 'responder':
media += 'a=recvonly\r\n';
sdp += 'a=recvonly\r\n';
break;
case 'none':
media += 'a=inactive\r\n';
sdp += 'a=inactive\r\n';
break;
case 'both':
media += 'a=sendrecv\r\n';
sdp += 'a=sendrecv\r\n';
break;
}
media += `a=mid:${content.attr('name')}\r\n`;
sdp += `a=mid:${content.attr('name')}\r\n`;

@@ -733,131 +676,79 @@ // <description><rtcp-mux/></description>

// and http://mail.jabber.org/pipermail/jingle/2011-December/001761.html
if (desc.find('rtcp-mux').length) {
media += 'a=rtcp-mux\r\n';
if (desc.find('>rtcp-mux').length) {
sdp += 'a=rtcp-mux\r\n';
}
if (desc.find('encryption').length) {
desc.find('encryption>crypto').each(function() {
/* eslint-disable no-invalid-this */
media += `a=crypto:${this.getAttribute('tag')}`;
media += ` ${this.getAttribute('crypto-suite')}`;
media += ` ${this.getAttribute('key-params')}`;
if (this.getAttribute('session-params')) {
media += ` ${this.getAttribute('session-params')}`;
}
media += '\r\n';
desc.find('>payload-type').each((_, payloadType) => {
sdp += `${SDPUtil.buildRTPMap(payloadType)}\r\n`;
if ($(payloadType).find('>parameter').length) {
sdp += `a=fmtp:${payloadType.getAttribute('id')} `;
sdp
+= $(payloadType)
.find('>parameter')
.map((__, parameter) => {
const name = parameter.getAttribute('name');
/* eslint-enable no-invalid-this */
});
}
desc.find('payload-type').each(function() {
/* eslint-disable no-invalid-this */
media += `${SDPUtil.buildRTPMap(this)}\r\n`;
if ($(this).find('>parameter').length) {
media += `a=fmtp:${this.getAttribute('id')} `;
media
+= $(this)
.find('parameter')
.map(function() {
const name = this.getAttribute('name');
return (
(name ? `${name}=` : '')
+ this.getAttribute('value'));
+ parameter.getAttribute('value'));
})
.get()
.join('; ');
media += '\r\n';
sdp += '\r\n';
}
// xep-0293
media += self.rtcpFbFromJingle($(this), this.getAttribute('id'));
/* eslint-enable no-invalid-this */
sdp += this.rtcpFbFromJingle($(payloadType), payloadType.getAttribute('id'));
});
// xep-0293
media += self.rtcpFbFromJingle(desc, '*');
sdp += this.rtcpFbFromJingle(desc, '*');
// xep-0294
tmp
= desc.find(
'>rtp-hdrext[xmlns="urn:xmpp:jingle:apps:rtp:rtp-hdrext:0"]');
tmp.each(function() {
/* eslint-disable no-invalid-this */
media
+= `a=extmap:${this.getAttribute('id')} ${
this.getAttribute('uri')}\r\n`;
/* eslint-enable no-invalid-this */
});
content
.find(
'>transport[xmlns="urn:xmpp:jingle:transports:ice-udp:1"]'
+ '>candidate')
.each(function() {
/* eslint-disable no-invalid-this */
let protocol = this.getAttribute('protocol');
protocol
= typeof protocol === 'string' ? protocol.toLowerCase() : '';
if ((self.removeTcpCandidates
&& (protocol === 'tcp' || protocol === 'ssltcp'))
|| (self.removeUdpCandidates && protocol === 'udp')) {
return;
} else if (self.failICE) {
this.setAttribute('ip', '1.1.1.1');
}
media += SDPUtil.candidateFromJingle(this);
/* eslint-enable no-invalid-this */
desc
.find('>rtp-hdrext[xmlns="urn:xmpp:jingle:apps:rtp:rtp-hdrext:0"]')
.each((_, hdrExt) => {
sdp
+= `a=extmap:${hdrExt.getAttribute('id')} ${
hdrExt.getAttribute('uri')}\r\n`;
});
// XEP-0339 handle ssrc-group attributes
content
.find('description>ssrc-group[xmlns="urn:xmpp:jingle:apps:rtp:ssma:0"]')
.each(function() {
/* eslint-disable no-invalid-this */
const semantics = this.getAttribute('semantics');
desc
.find('>ssrc-group[xmlns="urn:xmpp:jingle:apps:rtp:ssma:0"]')
.each((_, ssrcGroup) => {
const semantics = ssrcGroup.getAttribute('semantics');
const ssrcs
= $(this)
= $(ssrcGroup)
.find('>source')
.map(function() {
return this.getAttribute('ssrc');
})
.map((__, source) => source.getAttribute('ssrc'))
.get();
if (ssrcs.length) {
media += `a=ssrc-group:${semantics} ${ssrcs.join(' ')}\r\n`;
sdp += `a=ssrc-group:${semantics} ${ssrcs.join(' ')}\r\n`;
}
/* eslint-enable no-invalid-this */
});
tmp
= content.find(
'description>source[xmlns="urn:xmpp:jingle:apps:rtp:ssma:0"]');
tmp.each(function() {
/* eslint-disable no-invalid-this */
const ssrc = this.getAttribute('ssrc');
// XEP-0339 handle source attributes
desc
.find('>source[xmlns="urn:xmpp:jingle:apps:rtp:ssma:0"]')
.each((_, source) => {
const ssrc = source.getAttribute('ssrc');
// eslint-disable-next-line newline-per-chained-call
$(this).find('>parameter').each(function() {
const name = this.getAttribute('name');
let value = this.getAttribute('value');
$(source)
.find('>parameter')
.each((__, parameter) => {
const name = parameter.getAttribute('name');
let value = parameter.getAttribute('value');
value = SDPUtil.filterSpecialChars(value);
media += `a=ssrc:${ssrc} ${name}`;
if (value && value.length) {
media += `:${value}`;
}
media += '\r\n';
value = SDPUtil.filterSpecialChars(value);
sdp += `a=ssrc:${ssrc} ${name}`;
if (value && value.length) {
sdp += `:${value}`;
}
sdp += '\r\n';
});
});
/* eslint-enable no-invalid-this */
});
return media;
return sdp;
};
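The `senders` switch near the end of `jingle2media` above maps Jingle content semantics onto SDP direction attributes. A minimal standalone sketch of that mapping (the function name is illustrative, not part of lib-jitsi-meet):

```javascript
// Maps a Jingle <content senders='…'> value to the SDP direction attribute,
// mirroring the switch in jingle2media: 'initiator' means only the initiator
// sends, so the generated (responder-side) SDP line is sendonly.
function sendersToDirection(senders) {
    switch (senders) {
    case 'initiator':
        return 'a=sendonly';
    case 'responder':
        return 'a=recvonly';
    case 'none':
        return 'a=inactive';
    case 'both':
        return 'a=sendrecv';
    default:
        return '';
    }
}
```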

@@ -0,4 +1,13 @@

/* globals $ */
import { $iq } from 'strophe.js';
import SDP from './SDP';
/**
* @param {string} xml - raw xml of the stanza
*/
function createStanzaElement(xml) {
return new DOMParser().parseFromString(xml, 'text/xml').documentElement;
}
describe('SDP', () => {

@@ -17,2 +26,3 @@ describe('toJingle', () => {

'a=rtpmap:126 telephone-event/8000\r\n',
'a=fmtp:126 0-15\r\n',
'a=fmtp:111 minptime=10;useinbandfec=1\r\n',

@@ -96,2 +106,145 @@ 'a=rtcp:9 IN IP4 0.0.0.0\r\n',

});
describe('fromJingle', () => {
/* eslint-disable max-len*/
const stanza = `<iq>
<jingle action='session-initiate' initiator='focus' sid='123' xmlns='urn:xmpp:jingle:1'>
<content creator='initiator' name='audio' senders='both'>
<description media='audio' maxptime='60' xmlns='urn:xmpp:jingle:apps:rtp:1'>
<payload-type channels='2' clockrate='48000' name='opus' id='111'>
<parameter name='minptime' value='10'/>
<parameter name='useinbandfec' value='1'/>
<rtcp-fb type='transport-cc' xmlns='urn:xmpp:jingle:apps:rtp:rtcp-fb:0'/>
</payload-type>
<payload-type clockrate='16000' name='ISAC' id='103'/>
<payload-type clockrate='32000' name='ISAC' id='104'/>
<payload-type clockrate='8000' name='telephone-event' id='126'>
<parameter name="" value="0-15"/>
</payload-type>
<rtp-hdrext uri='urn:ietf:params:rtp-hdrext:ssrc-audio-level' id='1' xmlns='urn:xmpp:jingle:apps:rtp:rtp-hdrext:0'/>
<rtp-hdrext uri='http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01' id='5' xmlns='urn:xmpp:jingle:apps:rtp:rtp-hdrext:0'/>
<rtcp-mux/>
<source ssrc='4039389863' xmlns='urn:xmpp:jingle:apps:rtp:ssma:0'>
<parameter name='cname' value='mixed'/>
<parameter name='label' value='mixedlabelaudio0'/>
<parameter name='msid' value='mixedmslabel mixedlabelaudio0'/>
<parameter name='mslabel' value='mixedmslabel'/>
</source>
</description>
<transport ufrag='someufrag' pwd='somepwd' xmlns='urn:xmpp:jingle:transports:ice-udp:1'>
<fingerprint hash='sha-256' required='false' setup='actpass' xmlns='urn:xmpp:jingle:apps:dtls:0'>09:B1:51:0F:85:4C:80:19:A1:AF:81:73:47:EE:ED:3D:00:3A:84:C7:76:C1:4E:34:BE:56:F6:42:AD:15:D5:D7</fingerprint>
<candidate foundation='1' id='3cbe5aea5bde0c1401a60bbc2' network='0' protocol='udp' generation='0' port='10000' priority='2130706431' type='host' ip='10.0.0.1' component='1'/>
<candidate rel-addr='10.0.0.1' network='0' foundation='2' id='dfcfd075bde0c140ffffffff927646ba' port='10000' protocol='udp' generation='0' rel-port='10000' priority='1694498815' type='srflx' ip='10.0.0.2' component='1'/>
</transport>
</content>
<content creator='initiator' name='video' senders='both'>
<description media='video' xmlns='urn:xmpp:jingle:apps:rtp:1'>
<payload-type clockrate='90000' name='VP8' id='100'>
<rtcp-fb subtype='fir' type='ccm' xmlns='urn:xmpp:jingle:apps:rtp:rtcp-fb:0'/>
<rtcp-fb type='nack' xmlns='urn:xmpp:jingle:apps:rtp:rtcp-fb:0'/>
<rtcp-fb subtype='pli' type='nack' xmlns='urn:xmpp:jingle:apps:rtp:rtcp-fb:0'/>
<rtcp-fb type='goog-remb' xmlns='urn:xmpp:jingle:apps:rtp:rtcp-fb:0'/>
<rtcp-fb type='transport-cc' xmlns='urn:xmpp:jingle:apps:rtp:rtcp-fb:0'/>
<parameter name='x-google-start-bitrate' value='800'/>
</payload-type>
<payload-type clockrate='90000' name='rtx' id='96'>
<rtcp-fb subtype='fir' type='ccm' xmlns='urn:xmpp:jingle:apps:rtp:rtcp-fb:0'/>
<rtcp-fb type='nack' xmlns='urn:xmpp:jingle:apps:rtp:rtcp-fb:0'/>
<rtcp-fb subtype='pli' type='nack' xmlns='urn:xmpp:jingle:apps:rtp:rtcp-fb:0'/>
<parameter name='apt' value='100'/>
</payload-type>
<rtp-hdrext uri='http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time' id='3' xmlns='urn:xmpp:jingle:apps:rtp:rtp-hdrext:0'/>
<rtp-hdrext uri='http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01' id='5' xmlns='urn:xmpp:jingle:apps:rtp:rtp-hdrext:0'/>
<rtcp-mux/>
<source ssrc='3758540092' xmlns='urn:xmpp:jingle:apps:rtp:ssma:0'>
<parameter name='cname' value='mixed'/>
<parameter name='label' value='mixedlabelvideo0'/>
<parameter name='msid' value='mixedmslabel mixedlabelvideo0'/>
<parameter name='mslabel' value='mixedmslabel'/>
</source>
</description>
<transport ufrag='someufrag' pwd='somepwd' xmlns='urn:xmpp:jingle:transports:ice-udp:1'>
<fingerprint hash='sha-256' required='false' setup='actpass' xmlns='urn:xmpp:jingle:apps:dtls:0'>09:B1:51:0F:85:4C:80:19:A1:AF:81:73:47:EE:ED:3D:00:3A:84:C7:76:C1:4E:34:BE:56:F6:42:AD:15:D5:D7</fingerprint>
<candidate foundation='1' id='3cbe5aea5bde0c1401a60bbc2' network='0' protocol='udp' generation='0' port='10000' priority='2130706431' type='host' ip='10.0.0.1' component='1'/>
<candidate rel-addr='10.0.0.1' network='0' foundation='2' id='dfcfd075bde0c140ffffffff927646ba' port='10000' protocol='udp' generation='0' rel-port='10000' priority='1694498815' type='srflx' ip='10.0.0.2' component='1'/>
</transport>
</content>
<group semantics='BUNDLE' xmlns='urn:xmpp:jingle:apps:grouping:0'>
<content name='audio'/>
<content name='video'/>
</group>
</jingle></iq>`;
const expectedSDP = `v=0
o=- 123 2 IN IP4 0.0.0.0
s=-
t=0 0
a=group:BUNDLE audio video
m=audio 1 RTP/SAVPF 111 103 104 126
c=IN IP4 0.0.0.0
a=rtcp:1 IN IP4 0.0.0.0
a=ice-ufrag:someufrag
a=ice-pwd:somepwd
a=fingerprint:sha-256 09:B1:51:0F:85:4C:80:19:A1:AF:81:73:47:EE:ED:3D:00:3A:84:C7:76:C1:4E:34:BE:56:F6:42:AD:15:D5:D7
a=setup:actpass
a=candidate:1 1 udp 2130706431 10.0.0.1 10000 typ host generation 0
a=candidate:2 1 udp 1694498815 10.0.0.2 10000 typ srflx raddr 10.0.0.1 rport 10000 generation 0
a=sendrecv
a=mid:audio
a=rtcp-mux
a=rtpmap:111 opus/48000/2
a=fmtp:111 minptime=10; useinbandfec=1
a=rtcp-fb:111 transport-cc
a=rtpmap:103 ISAC/16000
a=rtpmap:104 ISAC/32000
a=rtpmap:126 telephone-event/8000
a=fmtp:126 0-15
a=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level
a=extmap:5 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01
a=ssrc:4039389863 cname:mixed
a=ssrc:4039389863 label:mixedlabelaudio0
a=ssrc:4039389863 msid:mixedmslabel mixedlabelaudio0
a=ssrc:4039389863 mslabel:mixedmslabel
m=video 1 RTP/SAVPF 100 96
c=IN IP4 0.0.0.0
a=rtcp:1 IN IP4 0.0.0.0
a=ice-ufrag:someufrag
a=ice-pwd:somepwd
a=fingerprint:sha-256 09:B1:51:0F:85:4C:80:19:A1:AF:81:73:47:EE:ED:3D:00:3A:84:C7:76:C1:4E:34:BE:56:F6:42:AD:15:D5:D7
a=setup:actpass
a=candidate:1 1 udp 2130706431 10.0.0.1 10000 typ host generation 0
a=candidate:2 1 udp 1694498815 10.0.0.2 10000 typ srflx raddr 10.0.0.1 rport 10000 generation 0
a=sendrecv
a=mid:video
a=rtcp-mux
a=rtpmap:100 VP8/90000
a=fmtp:100 x-google-start-bitrate=800
a=rtcp-fb:100 ccm fir
a=rtcp-fb:100 nack
a=rtcp-fb:100 nack pli
a=rtcp-fb:100 goog-remb
a=rtcp-fb:100 transport-cc
a=rtpmap:96 rtx/90000
a=fmtp:96 apt=100
a=rtcp-fb:96 ccm fir
a=rtcp-fb:96 nack
a=rtcp-fb:96 nack pli
a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=extmap:5 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01
a=ssrc:3758540092 cname:mixed
a=ssrc:3758540092 label:mixedlabelvideo0
a=ssrc:3758540092 msid:mixedmslabel mixedlabelvideo0
a=ssrc:3758540092 mslabel:mixedmslabel
`.split('\n').join('\r\n');
/* eslint-enable max-len*/
it('gets converted to SDP', () => {
const offer = createStanzaElement(stanza);
const sdp = new SDP('');
sdp.fromJingle($(offer).find('>jingle'));
const rawSDP = sdp.raw.replace(/o=- \d+/, 'o=- 123'); // replace generated o= timestamp.
expect(rawSDP).toEqual(expectedSDP);
});
});
});
/* global __filename */
import { getLogger } from 'jitsi-meet-logger';
import {

@@ -5,0 +6,0 @@ parsePrimarySSRC,

import { getLogger } from 'jitsi-meet-logger';
const logger = getLogger(__filename);
import CodecMimeType from '../../service/RTC/CodecMimeType';
import browser from '../browser';
import RandomUtil from '../util/RandomUtil';
import browser from '../browser';

@@ -512,3 +513,3 @@ const SDPUtil = {

* @param {object} mLine an mLine object as parsed from transform.parse
* @param {number} ssrc the ssrc for which an attribtue is desired
* @param {number} ssrc the ssrc for which an attribute is desired
* @param {string} attributeName the name of the desired attribute

@@ -568,42 +569,33 @@ * @returns {string} the value corresponding to the given ssrc

/**
* Sets the given codecName as the preferred codec by
* moving it to the beginning of the payload types
* list (modifies the given mline in place). If there
* are multiple options within the same codec (multiple h264
* profiles, for instance), this will prefer the first one
* that is found.
* @param {object} videoMLine the video mline object from
* an sdp as parsed by transform.parse
* Sets the given codecName as the preferred codec by moving it to the beginning
* of the payload types list (modifies the given mline in place). All instances
* of the codec are moved up.
* @param {object} mLine the mline object from an sdp as parsed by transform.parse
* @param {string} codecName the name of the preferred codec
*/
preferVideoCodec(videoMLine, codecName) {
let payloadType = null;
if (!videoMLine || !codecName) {
preferCodec(mline, codecName) {
if (!mline || !codecName) {
return;
}
for (let i = 0; i < videoMLine.rtp.length; ++i) {
const rtp = videoMLine.rtp[i];
const matchingPayloadTypes = mline.rtp
.filter(rtp => rtp.codec && rtp.codec.toLowerCase() === codecName.toLowerCase())
.map(rtp => rtp.payload);
if (rtp.codec
&& rtp.codec.toLowerCase() === codecName.toLowerCase()) {
payloadType = rtp.payload;
break;
}
}
if (payloadType) {
// Call toString() on payloads to get around an issue within
// SDPTransform that sets payloads as a number, instead of a string,
// when there is only one payload.
if (matchingPayloadTypes) {
// Call toString() on payloads to get around an issue within SDPTransform that sets
// payloads as a number, instead of a string, when there is only one payload.
const payloadTypes
= videoMLine.payloads
.toString()
.split(' ')
.map(p => parseInt(p, 10));
const payloadIndex = payloadTypes.indexOf(payloadType);
= mline.payloads
.toString()
.split(' ')
.map(p => parseInt(p, 10));
payloadTypes.splice(payloadIndex, 1);
payloadTypes.unshift(payloadType);
videoMLine.payloads = payloadTypes.join(' ');
for (const pt of matchingPayloadTypes.reverse()) {
const payloadIndex = payloadTypes.indexOf(pt);
payloadTypes.splice(payloadIndex, 1);
payloadTypes.unshift(pt);
}
mline.payloads = payloadTypes.join(' ');
}
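The reordering logic above moves every payload type of the preferred codec to the front of the m-line's payload list, preserving their relative order. A condensed, illustrative sketch (the mline shape follows sdp-transform's parse output; the function name is not part of the library):

```javascript
// Moves all payload types whose codec matches codecName to the front of a
// space-separated payloads string, keeping their relative order intact.
function preferCodecSketch(mline, codecName) {
    const matching = mline.rtp
        .filter(rtp => rtp.codec && rtp.codec.toLowerCase() === codecName.toLowerCase())
        .map(rtp => rtp.payload);

    // toString() works around sdp-transform storing a single payload as a number.
    const payloadTypes = mline.payloads.toString().split(' ').map(p => parseInt(p, 10));

    // Unshift in reverse so the first matching payload ends up first overall.
    for (const pt of matching.reverse()) {
        payloadTypes.splice(payloadTypes.indexOf(pt), 1);
        payloadTypes.unshift(pt);
    }

    return payloadTypes.join(' ');
}
```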

@@ -617,20 +609,34 @@ },

*
* @param {object} videoMLine the video mline object from an sdp as parsed
* by transform.parse.
* @param {object} mLine the mline object from an sdp as parsed by transform.parse.
* @param {string} codecName the name of the codec which will be stripped.
* @param {boolean} highProfile determines if only the high profile H264 codec needs to be
* stripped from the sdp when the passed codecName is H264.
*/
stripVideoCodec(videoMLine, codecName) {
if (!videoMLine || !codecName) {
stripCodec(mLine, codecName, highProfile = false) {
if (!mLine || !codecName) {
return;
}
const removePts = [];
const h264Pts = [];
let removePts = [];
const stripH264HighCodec = codecName.toLowerCase() === CodecMimeType.H264 && highProfile;
for (const rtp of videoMLine.rtp) {
for (const rtp of mLine.rtp) {
if (rtp.codec
&& rtp.codec.toLowerCase() === codecName.toLowerCase()) {
removePts.push(rtp.payload);
if (stripH264HighCodec) {
h264Pts.push(rtp.payload);
} else {
removePts.push(rtp.payload);
}
}
}
// High profile H264 codecs have 64 as the first two hex digits of the profile-level-id.
if (stripH264HighCodec) {
removePts = mLine.fmtp
.filter(item => h264Pts.indexOf(item.payload) > -1 && item.config.includes('profile-level-id=64'))
.map(item => item.payload);
}
if (removePts.length > 0) {

@@ -640,3 +646,3 @@ // We also need to remove the payload types that are related to RTX

const rtxApts = removePts.map(item => `apt=${item}`);
const rtxPts = videoMLine.fmtp.filter(
const rtxPts = mLine.fmtp.filter(
item => rtxApts.indexOf(item.config) !== -1);

@@ -649,3 +655,3 @@

// when there is only one payload.
const allPts = videoMLine.payloads
const allPts = mLine.payloads
.toString()

@@ -657,16 +663,16 @@ .split(' ')

if (keepPts.length === 0) {
// There are no other video codecs, disable the stream.
videoMLine.port = 0;
videoMLine.direction = 'inactive';
videoMLine.payloads = '*';
// There are no other codecs, disable the stream.
mLine.port = 0;
mLine.direction = 'inactive';
mLine.payloads = '*';
} else {
videoMLine.payloads = keepPts.join(' ');
mLine.payloads = keepPts.join(' ');
}
videoMLine.rtp = videoMLine.rtp.filter(
mLine.rtp = mLine.rtp.filter(
item => keepPts.indexOf(item.payload) !== -1);
videoMLine.fmtp = videoMLine.fmtp.filter(
mLine.fmtp = mLine.fmtp.filter(
item => keepPts.indexOf(item.payload) !== -1);
if (videoMLine.rtcpFb) {
videoMLine.rtcpFb = videoMLine.rtcpFb.filter(
if (mLine.rtcpFb) {
mLine.rtcpFb = mLine.rtcpFb.filter(
item => keepPts.indexOf(item.payload) !== -1);

@@ -673,0 +679,0 @@ }
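Stripping a codec from an m-line, as `stripCodec` does above, also has to drop the RTX payload types whose fmtp `apt=…` entry points at a removed payload. A condensed, illustrative sketch of that two-step filter (object shape follows sdp-transform's parse output; the function name is not part of the library):

```javascript
// Returns the payload types to keep after removing codecName and any RTX
// payload types whose fmtp config is apt=<removed payload>.
function keepPayloadsAfterStrip(mLine, codecName) {
    const removePts = mLine.rtp
        .filter(rtp => rtp.codec && rtp.codec.toLowerCase() === codecName.toLowerCase())
        .map(rtp => rtp.payload);

    // RTX payload types reference the stripped codec through apt=<payload>.
    const rtxApts = removePts.map(pt => `apt=${pt}`);
    const rtxPts = mLine.fmtp
        .filter(item => rtxApts.indexOf(item.config) !== -1)
        .map(item => item.payload);

    const allRemoved = removePts.concat(rtxPts);

    return mLine.payloads.toString().split(' ')
        .map(p => parseInt(p, 10))
        .filter(pt => allRemoved.indexOf(pt) === -1);
}
```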

import SDPUtil from './SDPUtil';
import * as SampleSdpStrings from './SampleSdpStrings.js';
import { default as SampleSdpStrings } from './SampleSdpStrings.js';

@@ -12,8 +12,8 @@ describe('SDPUtil', () => {

describe('preferVideoCodec', () => {
it('should move a preferred codec to the front', () => {
describe('preferCodec for video codec', () => {
it('should move a preferred video codec to the front', () => {
const sdp = SampleSdpStrings.multiCodecVideoSdp;
const videoMLine = sdp.media.find(m => m.type === 'video');
SDPUtil.preferVideoCodec(videoMLine, 'H264');
SDPUtil.preferCodec(videoMLine, 'H264');
const newPayloadTypesOrder

@@ -23,18 +23,47 @@ = videoMLine.payloads.split(' ').map(

expect(newPayloadTypesOrder[0]).toEqual(126);
expect(newPayloadTypesOrder[0]).toEqual(102);
expect(newPayloadTypesOrder[1]).toEqual(127);
});
});
describe('stripVideoCodec', () => {
it('should remove a codec', () => {
describe('preferCodec for audio codec', () => {
it('should move a preferred audio codec to the front', () => {
const sdp = SampleSdpStrings.multiCodecVideoSdp;
const audioMLine = sdp.media.find(m => m.type === 'audio');
SDPUtil.preferCodec(audioMLine, 'ISAC');
const newPayloadTypesOrder
= audioMLine.payloads.split(' ').map(
ptStr => parseInt(ptStr, 10));
expect(newPayloadTypesOrder[0]).toEqual(103);
expect(newPayloadTypesOrder[1]).toEqual(104);
});
});
describe('strip Video Codec', () => {
it('should remove a video codec', () => {
const sdp = SampleSdpStrings.multiCodecVideoSdp;
const videoMLine = sdp.media.find(m => m.type === 'video');
SDPUtil.stripVideoCodec(videoMLine, 'H264');
SDPUtil.stripCodec(videoMLine, 'H264');
const newPayloadTypes = videoMLine.payloads.split(' ').map(Number);
expect(newPayloadTypes.length).toEqual(1);
expect(newPayloadTypes[0]).toEqual(100);
expect(newPayloadTypes.length).toEqual(4);
expect(newPayloadTypes[0]).toEqual(96);
});
});
describe('strip Audio Codec', () => {
it('should remove an audio codec', () => {
const sdp = SampleSdpStrings.multiCodecVideoSdp;
const audioMLine = sdp.media.find(m => m.type === 'audio');
SDPUtil.stripCodec(audioMLine, 'OPUS');
const newPayloadTypes = audioMLine.payloads.split(' ').map(Number);
expect(newPayloadTypes.length).toEqual(3);
expect(newPayloadTypes[0]).toEqual(103);
});
});
});
/* global __filename */
import { getLogger } from 'jitsi-meet-logger';
import * as MediaType from '../../service/RTC/MediaType';

@@ -5,0 +6,0 @@ import * as SignalingEvents from '../../service/RTC/SignalingEvents';

@@ -6,5 +6,6 @@ /* global $ */

import XMPPEvents from '../../service/xmpp/XMPPEvents';
import ChatRoom from './ChatRoom';
import { ConnectionPluginListenable } from './ConnectionPlugin';
import XMPPEvents from '../../service/xmpp/XMPPEvents';

@@ -11,0 +12,0 @@ const logger = getLogger(__filename);

/* global $, __filename */
import { getLogger } from 'jitsi-meet-logger';
import { $iq, Strophe } from 'strophe.js';
import {

@@ -8,12 +11,9 @@ ACTION_JINGLE_TR_RECEIVED,

} from '../../service/statistics/AnalyticsEvents';
import { getLogger } from 'jitsi-meet-logger';
import { $iq, Strophe } from 'strophe.js';
import XMPPEvents from '../../service/xmpp/XMPPEvents';
import Statistics from '../statistics/statistics';
import GlobalOnErrorHandler from '../util/GlobalOnErrorHandler';
import RandomUtil from '../util/RandomUtil';
import Statistics from '../statistics/statistics';
import ConnectionPlugin from './ConnectionPlugin';
import JingleSessionPC from './JingleSessionPC';
import ConnectionPlugin from './ConnectionPlugin';

@@ -153,2 +153,5 @@ const logger = getLogger(__filename);

} as ${isP2P ? '' : '*not*'} P2P`);
const iceConfig = isP2P ? this.p2pIceConfig : this.jvbIceConfig;
sess

@@ -161,3 +164,6 @@ = new JingleSessionPC(

this.mediaConstraints,
isP2P ? this.p2pIceConfig : this.jvbIceConfig,
// Makes a copy in order to prevent an exception thrown on RN when either this.p2pIceConfig or
// this.jvbIceConfig is modified and there's a PeerConnection instance holding a reference to it
JSON.parse(JSON.stringify(iceConfig)),
isP2P,

@@ -366,13 +372,22 @@ /* initiator */ false);

if (options.useStunTurn) {
// we want to filter and leave only tcp/turns candidates
// which make sense for the jvb connections
this.jvbIceConfig.iceServers
= iceservers.filter(s => s.urls.startsWith('turns'));
// Shuffle ICEServers for load balancing
for (let i = iceservers.length - 1; i > 0; i--) {
const j = Math.floor(Math.random() * (i + 1));
const temp = iceservers[i];
iceservers[i] = iceservers[j];
iceservers[j] = temp;
}
if (options.p2p && options.p2p.useStunTurn) {
this.p2pIceConfig.iceServers = iceservers;
let filter;
if (options.useTurnUdp) {
filter = s => s.urls.startsWith('turn');
} else {
// By default we filter out STUN and TURN/UDP and leave only TURN/TCP.
filter = s => s.urls.startsWith('turn') && (s.urls.indexOf('transport=tcp') >= 0);
}
this.jvbIceConfig.iceServers = iceservers.filter(filter);
this.p2pIceConfig.iceServers = iceservers;
}, err => {

@@ -379,0 +394,0 @@ logger.warn('getting turn credentials failed', err);
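The in-place shuffle applied to `iceservers` above is a standard Fisher–Yates shuffle, used to spread clients across TURN servers, and the default JVB filter keeps only TURN over TCP. Both extracted as a sketch (the example server URLs are illustrative):

```javascript
// Fisher–Yates shuffle: walk from the end, swapping each element with a
// random earlier (or same) position; every permutation is equally likely.
function shuffle(arr) {
    for (let i = arr.length - 1; i > 0; i--) {
        const j = Math.floor(Math.random() * (i + 1));
        const temp = arr[i];
        arr[i] = arr[j];
        arr[j] = temp;
    }
    return arr;
}

// Default JVB filter from the code above: keep only TURN/TCP servers.
const keepTurnTcp = s => s.urls.startsWith('turn') && s.urls.indexOf('transport=tcp') >= 0;
```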

@@ -17,10 +17,12 @@ import { getLogger } from 'jitsi-meet-logger';

/**
* Ping timeout error after 15 sec of waiting.
* Ping timeout error after 5 sec of waiting.
*/
const PING_TIMEOUT = 15000;
const PING_TIMEOUT = 5000;
/**
* Will close the connection after 3 consecutive ping errors.
* How many ping failures will be tolerated before the WebSocket connection is killed.
* The worst-case scenario when a ping times out without a response is (25 seconds at the time of this writing):
* PING_THRESHOLD * PING_INTERVAL + PING_TIMEOUT
*/
const PING_THRESHOLD = 3;
const PING_THRESHOLD = 2;
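With the updated constants, detecting a dead connection is bounded by `PING_THRESHOLD * PING_INTERVAL + PING_TIMEOUT`. `PING_INTERVAL` is not shown in this diff; 10000 ms is assumed below, inferred from the 25-second figure quoted in the comment above:

```javascript
// Assumed value: PING_INTERVAL is defined elsewhere in strophe.ping;
// 10000 ms is inferred from the "25 seconds" worst case noted in the diff.
const PING_INTERVAL = 10000;
const PING_TIMEOUT = 5000;
const PING_THRESHOLD = 2;

// Two tolerated failures, each an interval apart, plus one final timeout.
const worstCaseMs = PING_THRESHOLD * PING_INTERVAL + PING_TIMEOUT;
```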

@@ -42,10 +44,12 @@ /**

* Constructs new object
* @param {XMPP} xmpp the xmpp module.
* @param {Object} options
* @param {Function} options.onPingThresholdExceeded - Callback called when ping fails too many times (controlled
* by the {@link PING_THRESHOLD} constant).
* @constructor
*/
constructor(xmpp) {
constructor({ onPingThresholdExceeded }) {
super();
this.failedPings = 0;
this.xmpp = xmpp;
this.pingExecIntervals = new Array(PING_TIMESTAMPS_TO_KEEP);
this._onPingThresholdExceeded = onPingThresholdExceeded;
}

@@ -81,3 +85,4 @@

iq.c('ping', { xmlns: Strophe.NS.PING });
this.connection.sendIQ(iq, success, error, timeout);
this.connection.sendIQ2(iq, { timeout })
.then(success, error);
}

@@ -106,9 +111,3 @@

logger.error(errmsg, error);
// FIXME it doesn't help to disconnect when 3rd PING
// times out, it only stops Strophe from retrying.
// Not really sure what's the right thing to do in that
// situation, but just closing the connection makes no
// sense.
// self.connection.disconnect();
this._onPingThresholdExceeded && this._onPingThresholdExceeded();
} else {

@@ -115,0 +114,0 @@ logger.warn(errmsg, error);

@@ -7,17 +7,18 @@ /* global $ */

import RandomUtil from '../util/RandomUtil';
import * as JitsiConnectionErrors from '../../JitsiConnectionErrors';
import * as JitsiConnectionEvents from '../../JitsiConnectionEvents';
import XMPPEvents from '../../service/xmpp/XMPPEvents';
import browser from '../browser';
import { E2EEncryption } from '../e2ee/E2EEncryption';
import GlobalOnErrorHandler from '../util/GlobalOnErrorHandler';
import Listenable from '../util/Listenable';
import RandomUtil from '../util/RandomUtil';
import Caps from './Caps';
import XmppConnection from './XmppConnection';
import MucConnectionPlugin from './strophe.emuc';
import JingleConnectionPlugin from './strophe.jingle';
import initStropheLogger from './strophe.logger';
import RayoConnectionPlugin from './strophe.rayo';
import initStropheUtil from './strophe.util';
import PingConnectionPlugin from './strophe.ping';
import RayoConnectionPlugin from './strophe.rayo';
import initStropheLogger from './strophe.logger';
import Listenable from '../util/Listenable';
import Caps from './Caps';
import GlobalOnErrorHandler from '../util/GlobalOnErrorHandler';
import XMPPEvents from '../../service/xmpp/XMPPEvents';
import XmppConnection from './XmppConnection';

@@ -151,5 +152,9 @@ const logger = getLogger(__filename);

if (!this.options.disableRtx && browser.supportsRtx()) {
// Disable RTX on Firefox because of https://bugzilla.mozilla.org/show_bug.cgi?id=1668028.
if (!(this.options.disableRtx || browser.isFirefox())) {
this.caps.addFeature('urn:ietf:rfc:4588');
}
if (this.options.enableOpusRed === true && browser.supportsAudioRed()) {
this.caps.addFeature('http://jitsi.org/opus-red');
}

@@ -168,3 +173,3 @@ // this is dealt with by SDP O/A so we don't need to announce this

// Enable Lipsync ?
if (browser.isChrome() && this.options.enableLipSync !== false) {
if (browser.isChromiumBased() && this.options.enableLipSync === true) {
logger.info('Lip-sync enabled !');

@@ -177,11 +182,6 @@ this.caps.addFeature('http://jitsi.org/meet/lipsync');

}
}
/**
* Returns {@code true} if the PING functionality is supported by the server
* or {@code false} otherwise.
* @returns {boolean}
*/
isPingSupported() {
return this._pingSupported !== false;
if (E2EEncryption.isSupported(this.options)) {
this.caps.addFeature('https://jitsi.org/meet/e2ee');
}
}

@@ -217,12 +217,9 @@

this.eventEmitter.emit(XMPPEvents.CONNECTION_STATUS_CHANGED, credentials, status, msg);
if (status === Strophe.Status.CONNECTED
|| status === Strophe.Status.ATTACHED) {
if (this.options.useStunTurn
|| (this.options.p2p && this.options.p2p.useStunTurn)) {
this.connection.jingle.getStunAndTurnCredentials();
}
if (status === Strophe.Status.CONNECTED || status === Strophe.Status.ATTACHED) {
this.connection.jingle.getStunAndTurnCredentials();
logger.info(`My Jabber ID: ${this.connection.jid}`);
this.lastErrorMsg = undefined;
// XmppConnection emits CONNECTED again on reconnect - a good opportunity to clear any "last error" flags
this._resetState();

@@ -235,7 +232,5 @@ // Schedule ping ?

.then(({ features, identities }) => {
if (features.has(Strophe.NS.PING)) {
this._pingSupported = true;
this.connection.ping.startInterval(pingJid);
} else {
logger.warn(`Ping NOT supported by ${pingJid}`);
if (!features.has(Strophe.NS.PING)) {
logger.error(
`Ping NOT supported by ${pingJid} - please enable ping in your XMPP server config`);
}

@@ -252,2 +247,16 @@

}
if (identity.type === 'lobbyrooms') {
this.lobbySupported = true;
identity.name && this.caps.getFeaturesAndIdentities(identity.name, identity.type)
.then(({ features: f }) => {
f.forEach(fr => {
if (fr.endsWith('#displayname_required')) {
this.eventEmitter.emit(
JitsiConnectionEvents.DISPLAY_NAME_REQUIRED);
}
});
})
.catch(e => logger.warn('Error getting features from lobby.', e && e.message));
}
});

@@ -298,6 +307,5 @@

this.connection.ping.stopInterval();
const wasIntentionalDisconnect = this.disconnectInProgress;
const wasIntentionalDisconnect = Boolean(this.disconnectInProgress);
const errMsg = msg || this.lastErrorMsg;
this.disconnectInProgress = false;
if (this.anonymousConnectionFailed) {

@@ -389,5 +397,3 @@ // prompt user for username and password

this.anonymousConnectionFailed = false;
this.connectionFailed = false;
this.lastErrorMsg = undefined;
this._resetState();
this.connection.connect(

@@ -410,2 +416,3 @@ jid,

attach(options) {
this._resetState();
const now = this.connectionTimes.attaching = window.performance.now();

@@ -423,2 +430,13 @@

/**
* Resets any state/flag before starting a new connection.
* @private
*/
_resetState() {
this.anonymousConnectionFailed = false;
this.connectionFailed = false;
this.lastErrorMsg = undefined;
this.disconnectInProgress = undefined;
}
/**
*

@@ -469,3 +487,5 @@ * @param jid

createRoom(roomName, options, onCreateResource) {
let roomjid = `${roomName}@${this.options.hosts.muc}/`;
// There are cases (when using subdomain) where muc can hold an uppercase part
let roomjid = `${roomName}@${options.customDomain
? options.customDomain : this.options.hosts.muc.toLowerCase()}/`;

@@ -517,4 +537,3 @@ const mucNickname = onCreateResource

/**
* Pings the server. Remember to check {@link isPingSupported} before using
* this method.
* Pings the server.
* @param timeout how many ms before a timeout should occur.

@@ -526,8 +545,4 @@ * @returns {Promise} resolved on ping success and reject on an error or

return new Promise((resolve, reject) => {
if (this.isPingSupported()) {
this.connection.ping
this.connection.ping
.ping(this.connection.domain, resolve, reject, timeout);
} else {
reject('PING operation is not supported by the server');
}
});

@@ -551,11 +566,9 @@ }

disconnect(ev) {
if (this.disconnectInProgress) {
    return this.disconnectInProgress;
} else if (!this.connection) {
    return Promise.resolve();
}
this.disconnectInProgress = new Promise(resolve => {
const disconnectListener = (credentials, status) => {

@@ -569,5 +582,7 @@ if (status === Strophe.Status.DISCONNECTED) {

this.eventEmitter.on(XMPPEvents.CONNECTION_STATUS_CHANGED, disconnectListener);
});
this._cleanupXmppConnection(ev);
return this.disconnectInProgress;
}

@@ -646,3 +661,2 @@

this.connection.addConnectionPlugin('jingle', new JingleConnectionPlugin(this, this.eventEmitter, iceConfig));
this.connection.addConnectionPlugin('ping', new PingConnectionPlugin(this));
this.connection.addConnectionPlugin('rayo', new RayoConnectionPlugin());

@@ -688,3 +702,3 @@ }

details.suspend_time = this.connection.ping.getPingSuspendTime();
details.time_since_last_success = this.connection.getTimeSinceLastSuccess();
/* eslint-enable camelcase */

@@ -747,2 +761,4 @@

} catch (e) {
logger.error(e);
return false;

@@ -766,3 +782,3 @@ }

|| from === this.conferenceDurationComponentAddress)) {
return true;
}

@@ -769,0 +785,0 @@

@@ -6,5 +6,6 @@ import { getLogger } from 'jitsi-meet-logger';

import Listenable from '../util/Listenable';
import { getJitterDelay } from '../util/Retry';
import ResumeTask from './ResumeTask';
import LastSuccessTracker from './StropheLastSuccess';
import PingConnectionPlugin from './strophe.ping';

@@ -56,8 +57,2 @@ const logger = getLogger(__filename);

/**
* The counter increased before each resume retry attempt, used to calculate exponential backoff.
* @type {number}
* @private
*/
this._resumeRetryN = 0;
this._stropheConn = new Strophe.Connection(serviceUrl);

@@ -69,6 +64,28 @@ this._usesWebsocket = serviceUrl.startsWith('ws:') || serviceUrl.startsWith('wss:');

this._lastSuccessTracker = new LastSuccessTracker();
this._lastSuccessTracker.startTracking(this, this._stropheConn);
this._resumeTask = new ResumeTask(this._stropheConn);
/**
* @typedef DeferredSendIQ Object
* @property {Element} iq - The IQ to send.
* @property {function} resolve - The resolve method of the deferred Promise.
* @property {function} reject - The reject method of the deferred Promise.
* @property {number} timeout - The ID of the timeout task that needs to be cleared, before sending the IQ.
*/
/**
* Deferred IQs to be sent upon reconnect.
* @type {Array<DeferredSendIQ>}
* @private
*/
this._deferredIQs = [];
// Ping plugin is mandatory for the Websocket mode to work correctly. It's used to detect when the connection
// is broken (WebSocket/TCP connection not closed gracefully).
this.addConnectionPlugin(
'ping',
new PingConnectionPlugin({
onPingThresholdExceeded: () => this._onPingErrorThresholdExceeded()
}));
}
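The comment above explains why the ping plugin is mandatory in WebSocket mode: it is the only way to notice a TCP connection that died without a graceful close. A self-contained sketch of that watchdog idea (the class and method names here are illustrative, not the library's API): count consecutive ping failures and fire a callback once a threshold is crossed, which the connection then uses to kill the half-open WebSocket.

```javascript
// Illustrative sketch of a ping-failure watchdog; not the library's actual
// PingConnectionPlugin, whose internals are not shown in this diff.
class PingWatchdog {
    constructor({ threshold = 2, onPingThresholdExceeded }) {
        this.threshold = threshold;
        this.failures = 0;
        this.onPingThresholdExceeded = onPingThresholdExceeded;
    }

    // A successful ping resets the failure counter.
    reportSuccess() {
        this.failures = 0;
    }

    // A failed ping bumps the counter; crossing the threshold fires the callback.
    reportFailure() {
        this.failures += 1;
        if (this.failures >= this.threshold) {
            this.onPingThresholdExceeded();
        }
    }
}
```

In this codebase the callback is wired to `_onPingErrorThresholdExceeded`, which closes the WebSocket so Strophe observes a disconnect and the resume logic can kick in.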

@@ -82,3 +99,6 @@

get connected() {
const websocket = this._stropheConn && this._stropheConn._proto && this._stropheConn._proto.socket;

return (this._status === Strophe.Status.CONNECTED || this._status === Strophe.Status.ATTACHED)
    && (!this.isUsingWebSocket || (websocket && websocket.readyState === WebSocket.OPEN));
}
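The stricter `connected` getter reflects that in WebSocket mode the XMPP status alone is not enough: the underlying socket can already be half-closed while Strophe still reports CONNECTED. A standalone sketch of the same check (`isConnected` and the string statuses are illustrative stand-ins for the getter and `Strophe.Status` constants):

```javascript
// WebSocket.OPEN is 1 per the WHATWG WebSocket spec; hard-coded here so the
// sketch runs outside a browser.
const OPEN = 1;

function isConnected(status, usesWebSocket, socket) {
    const statusOk = status === 'CONNECTED' || status === 'ATTACHED';

    // In WebSocket mode additionally require a live, open socket.
    return statusOk && (!usesWebSocket || (Boolean(socket) && socket.readyState === OPEN));
}
```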

@@ -234,7 +254,11 @@

if (status === Strophe.Status.CONNECTED || status === Strophe.Status.ATTACHED) {
this._maybeEnableStreamResume();
this._maybeStartWSKeepAlive();
this._resumeRetryN = 0;
this._processDeferredIQs();
this._resumeTask.cancel();
this.ping.startInterval(this.domain);
} else if (status === Strophe.Status.DISCONNECTED) {
this.ping.stopInterval();
// FIXME add RECONNECTING state instead of blocking the DISCONNECTED update

@@ -254,2 +278,14 @@ blockCallback = this._tryResumingConnection();

/**
* Clears the list of IQs and rejects deferred Promises with an error.
*
* @private
*/
_clearDeferredIQs() {
for (const deferred of this._deferredIQs) {
deferred.reject(new Error('disconnect'));
}
this._deferredIQs = [];
}
/**
* The method is meant to be used for testing. It's a shortcut for closing the WebSocket.

@@ -260,3 +296,6 @@ *

closeWebsocket() {
if (this._stropheConn && this._stropheConn._proto) {
    this._stropheConn._proto._closeSocket();
    this._stropheConn._proto._onClose(null);
}
}

@@ -270,4 +309,5 @@

disconnect(...args) {
clearTimeout(this._resumeTimeout);
this._resumeTask.cancel();
clearTimeout(this._wsKeepAlive);
this._clearDeferredIQs();
this._stropheConn.disconnect(...args);

@@ -290,6 +330,4 @@ }

*/
getTimeSinceLastSuccess() {
    return this._lastSuccessTracker.getTimeSinceLastSuccess();
}

@@ -341,3 +379,3 @@

this._wsKeepAlive = setTimeout(() => {
const url = this.service.replace('wss://', 'https://').replace('ws://', 'http://');
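Matching the full scheme with `://` matters here: a naive `replace('ws', 'http')` rewrites the first `ws` anywhere in the string, which can corrupt a hostname that happens to contain those letters. A runnable illustration (the hostname is made up):

```javascript
const service = 'wss://aws-west.example.com/xmpp-websocket';

// Naive: replaces the first bare 'wss'/'ws' it finds, mangling 'aws' in the host.
const naive = service.replace('wss', 'https').replace('ws', 'http');

// Scoped: only a scheme prefix like 'wss://' or 'ws://' can match.
const scoped = service.replace('wss://', 'https://').replace('ws://', 'http://');
```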

@@ -354,2 +392,26 @@ fetch(url).catch(

/**
* Goes over the list of {@link DeferredSendIQ} tasks and sends them.
*
* @private
* @returns {void}
*/
_processDeferredIQs() {
for (const deferred of this._deferredIQs) {
if (deferred.iq) {
clearTimeout(deferred.timeout);
const timeLeft = Date.now() - deferred.start;
this.sendIQ(
deferred.iq,
result => deferred.resolve(result),
error => deferred.reject(error),
timeLeft);
}
}
this._deferredIQs = [];
}
/**
* Send a stanza. This function is called to push data onto the send queue to go out over the wire.

@@ -388,2 +450,50 @@ *

/**
* Sends an IQ immediately if connected, or puts it on the send queue otherwise (in contrast to other send
* methods, which fail immediately when disconnected).
*
* @param {Element} iq - The IQ to send.
* @param {number} timeout - How long to wait for the response. The time when the connection is reconnecting is
* included, which means that the IQ may never be sent and still fail with a timeout.
*/
sendIQ2(iq, { timeout }) {
return new Promise((resolve, reject) => {
if (this.connected) {
this.sendIQ(
iq,
result => resolve(result),
error => reject(error),
timeout);
} else {
const deferred = {
iq,
resolve,
reject,
start: Date.now(),
timeout: setTimeout(() => {
// clears the IQ on timeout and invalidates the deferred task
deferred.iq = undefined;
// Strophe calls with undefined on timeout
reject(undefined);
}, timeout)
};
this._deferredIQs.push(deferred);
}
});
}
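The `sendIQ2`/`_processDeferredIQs` pair above implements a "defer until connected" queue: sends go through immediately while connected, otherwise they are parked with a timeout and flushed on reconnect. A self-contained sketch of the same pattern (`DeferredQueue` and its method names are illustrative, not the library's API):

```javascript
class DeferredQueue {
    constructor() {
        this.connected = false;
        this.pending = [];
    }

    // Send now if connected; otherwise park the payload with a timeout.
    send(payload, timeoutMs, transport) {
        return new Promise((resolve, reject) => {
            if (this.connected) {
                resolve(transport(payload));

                return;
            }
            const entry = {
                payload,
                resolve,
                reject,
                timer: setTimeout(() => {
                    entry.payload = undefined; // invalidate so flush skips it
                    reject(new Error('timeout'));
                }, timeoutMs)
            };

            this.pending.push(entry);
        });
    }

    // Called on (re)connect: flush everything that has not timed out yet.
    flush(transport) {
        this.connected = true;
        for (const entry of this.pending) {
            if (entry.payload !== undefined) {
                clearTimeout(entry.timer);
                entry.resolve(transport(entry.payload));
            }
        }
        this.pending = [];
    }
}
```

The invalidation trick mirrors the code above: on timeout the entry's payload is cleared rather than spliced out of the array, so `flush` can simply skip dead entries.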
/**
* Called by the ping plugin when ping fails too many times.
*
* @returns {void}
*/
_onPingErrorThresholdExceeded() {
if (this.isUsingWebSocket) {
logger.warn('Ping error threshold exceeded - killing the WebSocket');
this.closeWebsocket();
}
}
/**
* Helper function to send presence stanzas. The main benefit is for sending presence stanzas for which you expect

@@ -433,3 +543,3 @@ * a responding presence stanza with the same id (for example when leaving a chat room).

const res = navigator.sendBeacon(
this.service.indexOf('https://') === -1 ? `https:${this.service}` : this.service,
Strophe.serialize(body.tree()));

@@ -458,26 +568,4 @@

if (resumeToken) {
clearTimeout(this._resumeTimeout);
this._resumeTask.schedule();
// FIXME detect internet offline
// The retry delay will be:
// 1st retry: 1.5s - 3s
// 2nd retry: 3s - 9s
// 3rd retry: 3s - 27s
this._resumeRetryN = Math.min(3, this._resumeRetryN + 1);
const retryTimeout = getJitterDelay(this._resumeRetryN, 1500, 3);
logger.info(`Will try to resume the XMPP connection in ${retryTimeout}ms`);
this._resumeTimeout = setTimeout(() => {
logger.info('Trying to resume the XMPP connection');
const url = new URL(this._stropheConn.service);
url.searchParams.set('previd', resumeToken);
this._stropheConn.service = url.toString();
streamManagement.resume();
}, retryTimeout);
return true;
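The retry-delay comment in the removed code above describes a jittered exponential backoff: each attempt waits a random delay drawn from a window that widens with the retry count, and the counter is capped so the window stays bounded. A minimal sketch of that idea (`jitterDelay` is an illustrative stand-in for `getJitterDelay` from `util/Retry`, whose exact formula is not shown in this diff, so the computed ranges will not match the comment's numbers exactly):

```javascript
// Pick a random delay in [floor, floor * base], where the floor grows
// exponentially with the retry count.
function jitterDelay(retryCount, minDelay, base) {
    const floor = minDelay * Math.pow(base, retryCount - 1);
    const ceiling = floor * base;

    return floor + Math.random() * (ceiling - floor);
}

// Capping the counter at 3, as the removed code did, bounds the ceiling.
const cappedRetry = n => Math.min(3, n);
```

Jitter spreads reconnect attempts out in time, so many clients dropped by the same outage do not all hammer the server at the same instant.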

@@ -484,0 +572,0 @@ }

{
"name": "@q42/lib-jitsi-meet",
"version": "2.0.4289",
"description": "Borrel fork for accessing Jitsi server side deployments",
"repository": {
"type": "git",
"url": "https://github.com/Q42/Borrel.JitsiMeet.Lib"
},
"keywords": [
"jingle",
"webrtc",
"xmpp",
"browser",
"jitsi"
],
"author": "",
"readmeFilename": "README.md",
"dependencies": {
"@jitsi/js-utils": "^1.0.2",
"@jitsi/sdp-interop": "0.1.14",
"@jitsi/sdp-simulcast": "0.2.2",
"async": "0.9.0",
"current-executing-script": "0.1.3",
"jitsi-meet-logger": "github:jitsi/jitsi-meet-logger#5ec92357570dc8f0b7ffc1528820721c84c6af8b",
"lodash.isequal": "4.5.0",
"sdp-transform": "2.3.0",
"strophe.js": "1.3.4",
"strophejs-plugin-disco": "0.0.2",
"strophejs-plugin-stream-management": "github:jitsi/strophejs-plugin-stream-management#cec7608601c1bc098543823fc658e3ddf758c009",
"webrtc-adapter": "github:webrtc/adapter#1eec19782b4058d186341263e7d049cea3e3290a"
},
"devDependencies": {
"@babel/core": "7.5.5",
"@babel/plugin-proposal-class-properties": "7.1.0",
"@babel/plugin-proposal-export-namespace-from": "7.0.0",
"@babel/plugin-transform-flow-strip-types": "7.0.0",
"@babel/preset-env": "7.1.0",
"@babel/preset-flow": "7.0.0",
"babel-eslint": "10.0.1",
"babel-loader": "8.0.4",
"core-js": "2.5.1",
"eslint": "4.12.1",
"eslint-config-jitsi": "github:jitsi/eslint-config-jitsi#1.0.0",
"eslint-plugin-flowtype": "2.39.1",
"eslint-plugin-import": "2.8.0",
"flow-bin": "0.104.0",
"jasmine-core": "2.5.2",
"karma": "3.0.0",
"karma-chrome-launcher": "2.2.0",
"karma-jasmine": "1.1.2",
"karma-webpack": "3.0.0",
"precommit-hook": "3.0.0",
"string-replace-loader": "2.1.1",
"webpack": "4.26.1",
"webpack-bundle-analyzer": "3.4.1",
"webpack-cli": "3.1.2"
},
"scripts": {
"lint": "eslint . && flow",
"test": "karma start karma.conf.js",
"test-watch": "karma start karma.conf.js --no-single-run",
"validate": "npm ls",
"build": "webpack -p"
},
"pre-commit": [
"lint",
"test"
],
"main": "./index.js",
"license": "Apache-2.0"
"name": "@q42/lib-jitsi-meet",
"version": "2.0.5176",
"description": "Borrel fork for accessing Jitsi server side deployments",
"repository": {
"type": "git",
"url": "https://github.com/Q42/Borrel.JitsiMeet.Lib"
},
"keywords": [
"jingle",
"webrtc",
"xmpp",
"browser",
"jitsi"
],
"author": "",
"readmeFilename": "README.md",
"dependencies": {
"@jitsi/js-utils": "1.0.2",
"@jitsi/sdp-interop": "1.0.3",
"@jitsi/sdp-simulcast": "0.4.0",
"async": "0.9.0",
"base64-js": "1.3.1",
"current-executing-script": "0.1.3",
"jitsi-meet-logger": "github:jitsi/jitsi-meet-logger#5ec92357570dc8f0b7ffc1528820721c84c6af8b",
"lodash.clonedeep": "4.5.0",
"lodash.debounce": "4.0.8",
"lodash.isequal": "4.5.0",
"sdp-transform": "2.3.0",
"strophe.js": "1.3.4",
"strophejs-plugin-disco": "0.0.2",
"strophejs-plugin-stream-management": "github:jitsi/strophejs-plugin-stream-management#001cf02bef2357234e1ac5d163611b4d60bf2b6a",
"uuid": "8.1.0",
"webrtc-adapter": "7.5.0"
},
"devDependencies": {
"@babel/core": "7.5.5",
"@babel/plugin-proposal-class-properties": "7.1.0",
"@babel/plugin-proposal-export-namespace-from": "7.0.0",
"@babel/plugin-proposal-optional-chaining": "7.2.0",
"@babel/plugin-transform-flow-strip-types": "7.0.0",
"@babel/preset-env": "7.1.0",
"@babel/preset-flow": "7.0.0",
"babel-eslint": "10.0.1",
"babel-loader": "8.0.4",
"core-js": "2.5.1",
"eslint": "5.6.1",
"eslint-config-jitsi": "github:jitsi/eslint-config-jitsi#1.0.3",
"eslint-plugin-flowtype": "2.50.3",
"eslint-plugin-import": "2.20.2",
"flow-bin": "0.104.0",
"jasmine-core": "3.5.0",
"karma": "5.1.1",
"karma-chrome-launcher": "3.1.0",
"karma-jasmine": "3.1.1",
"karma-sourcemap-loader": "0.3.7",
"karma-webpack": "4.0.2",
"string-replace-loader": "2.1.1",
"webpack": "4.43.0",
"webpack-bundle-analyzer": "3.4.1",
"webpack-cli": "3.3.11"
},
"scripts": {
"lint": "eslint . && flow",
"postinstall": "webpack -p",
"test": "karma start karma.conf.js",
"test-watch": "karma start karma.conf.js --no-single-run",
"validate": "npm ls",
"watch": "webpack --config webpack.config.js --watch --mode development"
},
"main": "./index.js",
"license": "Apache-2.0"
}

@@ -7,6 +7,9 @@ # Jitsi Meet API library

[Check out the examples.](doc/API.md#installation)
- [Installation guide](doc/API.md#installation)
- [Check out the example](doc/example)
## Building the sources
NOTE: you need Node.js >= 12 and npm >= 6
To build the library, just type:

@@ -16,1 +19,16 @@ ```

```
To lint:
```
npm run lint
```
and to run unit tests:
```
npm test
```
If you need to rebuild lib-jitsi-meet.min.js:
```
npm run postinstall
```
Both linting and unit tests will also be run by a pre-commit hook.

@@ -23,3 +23,3 @@ const RTCEvents = {

IS_SELECTED_CHANGED: 'rtc.is_selected_change',
SENDER_VIDEO_CONSTRAINTS_CHANGED: 'rtc.sender_video_constraints_changed',

@@ -41,2 +41,7 @@ /**

/**
* The max enabled resolution of a local video track was changed.
*/
LOCAL_TRACK_MAX_ENABLED_RESOLUTION_CHANGED: 'rtc.local_track_max_enabled_resolution_changed',
TRACK_ATTACHED: 'rtc.track_attached',

@@ -43,0 +48,0 @@

@@ -33,1 +33,6 @@ /**

export const CONNECTION_STATS = 'statistics.connectionstats';
/**
* An event carrying performance stats.
*/
export const LONG_TASKS_STATS = 'statistics.long_tasks_stats';

@@ -111,2 +111,6 @@ const XMPPEvents = {

// Designates an event indicating that an invite XMPP message in the MUC was
// received.
INVITE_MESSAGE_RECEIVED: 'xmpp.invite_message_received',
// Designates an event indicating that a private XMPP message in the MUC was

@@ -131,2 +135,14 @@ // received.

// Designates an event indicating that a participant joined the lobby XMPP MUC.
MUC_LOBBY_MEMBER_JOINED: 'xmpp.muc_lobby_member_joined',
// Designates an event indicating that a participant in the lobby XMPP MUC has been updated
MUC_LOBBY_MEMBER_UPDATED: 'xmpp.muc_lobby_member_updated',
// Designates an event indicating that a participant left the XMPP MUC.
MUC_LOBBY_MEMBER_LEFT: 'xmpp.muc_lobby_member_left',
// Designates an event indicating that a participant was denied access to a conference from the lobby XMPP MUC.
MUC_DENIED_ACCESS: 'xmpp.muc_denied_access',
// Designates an event indicating that local participant left the muc

@@ -142,2 +158,5 @@ MUC_LEFT: 'xmpp.muc_left',

// Designates an event indicating that the MUC members only config has changed.
MUC_MEMBERS_ONLY_CHANGED: 'xmpp.muc_members_only_changed',
// Designates an event indicating that a participant in the XMPP MUC has

@@ -162,3 +181,2 @@ // advertised that they have audio muted (or unmuted).

PASSWORD_REQUIRED: 'xmpp.password_required',
PEERCONNECTION_READY: 'xmpp.peerconnection_ready',

@@ -193,2 +211,3 @@ /**

ROOM_JOIN_ERROR: 'xmpp.room_join_error',
ROOM_CONNECT_MEMBERS_ONLY_ERROR: 'xmpp.room_connect_error.members_only',

@@ -195,0 +214,0 @@ /**

@@ -1,82 +0,5 @@

/* global __dirname */
const process = require('process');
const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');
const analyzeBundle = process.argv.indexOf('--analyze-bundle') !== -1;
const config = require('./webpack-shared-config');
const minimize
= process.argv.indexOf('-p') !== -1
|| process.argv.indexOf('--optimize-minimize') !== -1;
const config = {
devtool: 'source-map',
mode: minimize ? 'production' : 'development',
module: {
rules: [ {
// Version this build of the lib-jitsi-meet library.
loader: 'string-replace-loader',
options: {
flags: 'g',
replace:
process.env.LIB_JITSI_MEET_COMMIT_HASH || 'development',
search: '{#COMMIT_HASH#}'
},
test: `${__dirname}/JitsiMeetJS.js`
}, {
// Transpile ES2015 (aka ES6) to ES5.
exclude: [
new RegExp(`${__dirname}/node_modules/(?!@jitsi/js-utils)`)
],
loader: 'babel-loader',
options: {
presets: [
[
'@babel/preset-env',
// Tell babel to avoid compiling imports into CommonJS
// so that webpack may do tree shaking.
{ modules: false }
],
'@babel/preset-flow'
],
plugins: [
'@babel/plugin-transform-flow-strip-types',
'@babel/plugin-proposal-class-properties',
'@babel/plugin-proposal-export-namespace-from'
]
},
test: /\.js$/
} ]
},
node: {
// Allow the use of the real filename of the module being executed. By
// default Webpack does not leak path-related information and provides a
// value that is a mock (/index.js).
__filename: true
},
optimization: {
concatenateModules: minimize
},
output: {
filename: `[name]${minimize ? '.min' : ''}.js`,
path: process.cwd(),
sourceMapFilename: `[name].${minimize ? 'min' : 'js'}.map`
},
performance: {
hints: minimize ? 'error' : false,
maxAssetSize: 750 * 1024,
maxEntrypointSize: 750 * 1024
},
plugins: [
analyzeBundle
&& new BundleAnalyzerPlugin({
analyzerMode: 'disabled',
generateStatsFile: true
})
].filter(Boolean)
};
module.exports = [

@@ -91,3 +14,16 @@ Object.assign({}, config, {

})
})
}),
{
entry: {
worker: './modules/e2ee/Worker.js'
},
mode: 'production',
output: {
filename: 'lib-jitsi-meet.e2ee-worker.js',
path: process.cwd()
},
optimization: {
minimize: false
}
}
];

