
livekit-client - npm package version comparison

Comparing version 0.11.0 to 0.11.1


dist/api/SignalClient.d.ts
import 'webrtc-adapter';
import { ParticipantInfo, TrackType } from '../proto/livekit_models';
import { JoinResponse, SignalRequest, SignalTarget, SpeakerInfo, TrackPublishedResponse, UpdateSubscription, UpdateTrackSettings, VideoQuality } from '../proto/livekit_rtc';
import { ParticipantInfo } from '../proto/livekit_models';
import { AddTrackRequest, JoinResponse, SignalRequest, SignalTarget, SpeakerInfo, TrackPublishedResponse, UpdateSubscription, UpdateTrackSettings, VideoQuality } from '../proto/livekit_rtc';
import { Track } from '../room/track/Track';

@@ -25,3 +25,3 @@ interface ConnectOpts {

sendMuteTrack(trackSid: string, muted: boolean): void;
sendAddTrack(cid: string, name: string, type: TrackType, dimensions?: Track.Dimensions): void;
sendAddTrack(req: AddTrackRequest): void;
sendUpdateTrackSettings(settings: UpdateTrackSettings): void;

@@ -64,3 +64,3 @@ sendUpdateSubscription(sub: UpdateSubscription): void;

sendMuteTrack(trackSid: string, muted: boolean): void;
sendAddTrack(cid: string, name: string, type: TrackType, dimensions?: Track.Dimensions): void;
sendAddTrack(req: AddTrackRequest): void;
sendUpdateTrackSettings(settings: UpdateTrackSettings): void;

@@ -67,0 +67,0 @@ sendUpdateSubscription(sub: UpdateSubscription): void;

@@ -43,2 +43,4 @@ "use strict";

connect(url, token, opts) {
// strip trailing slash
url = url.replace(/\/$/, '');
url += '/rtc';

@@ -163,12 +165,3 @@ let params = `?access_token=${token}&protocol=${version_1.protocolVersion}`;

}
sendAddTrack(cid, name, type, dimensions) {
const req = {
cid,
name,
type,
};
if (dimensions) {
req.width = dimensions.width;
req.height = dimensions.height;
}
sendAddTrack(req) {
this.sendRequest({

@@ -175,0 +168,0 @@ addTrack: livekit_rtc_1.AddTrackRequest.fromPartial(req),

@@ -6,3 +6,3 @@ import { ConnectOptions } from './options';

import LocalVideoTrack from './room/track/LocalVideoTrack';
import { CreateAudioTrackOptions, CreateLocalTracksOptions, CreateVideoTrackOptions } from './room/track/options';
import { CreateAudioTrackOptions, CreateLocalTracksOptions, CreateScreenTrackOptions, CreateVideoTrackOptions } from './room/track/options';
export { version } from './version';

@@ -36,2 +36,6 @@ /**

/**
* Creates a [[LocalVideoTrack]] of screen capture with getDisplayMedia()
*/
export declare function createLocalScreenTrack(options?: CreateScreenTrackOptions): Promise<LocalVideoTrack>;
/**
* creates a local video and audio track at the same time

@@ -38,0 +42,0 @@ * @param options
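
The hunk above adds createLocalScreenTrack() to the public API. A minimal usage sketch, not taken from the package itself: the server URL, token, and room variable are placeholders, and VideoPresets is assumed to be re-exported from the package entry point as it is in the sample code further down.

import { connect, createLocalScreenTrack, VideoPresets } from 'livekit-client';

// connect to a LiveKit server (URL and token are placeholders)
const room = await connect('wss://your-livekit-host', token);
// capture the screen; name defaults to 'screen' and resolution to full HD
const screenTrack = await createLocalScreenTrack();
// publish it like any other local video track
await room.localParticipant.publishTrack(screenTrack, {
  videoEncoding: VideoPresets.fhd.encoding,
});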

@@ -15,3 +15,3 @@ "use strict";

Object.defineProperty(exports, "__esModule", { value: true });
exports.createLocalTracks = exports.createLocalAudioTrack = exports.createLocalVideoTrack = exports.connect = exports.version = void 0;
exports.createLocalTracks = exports.createLocalScreenTrack = exports.createLocalAudioTrack = exports.createLocalVideoTrack = exports.connect = exports.version = void 0;
const loglevel_1 = __importDefault(require("loglevel"));

@@ -25,2 +25,3 @@ const SignalClient_1 = require("./api/SignalClient");

const LocalVideoTrack_1 = __importDefault(require("./room/track/LocalVideoTrack"));
const options_2 = require("./room/track/options");
const Track_1 = require("./room/track/Track");

@@ -125,2 +126,33 @@ var version_1 = require("./version");

/**
* Creates a [[LocalVideoTrack]] of screen capture with getDisplayMedia()
*/
function createLocalScreenTrack(options) {
return __awaiter(this, void 0, void 0, function* () {
if (options === undefined) {
options = {};
}
if (options.name === undefined) {
options.name = 'screen';
}
if (options.resolution === undefined) {
options.resolution = options_2.VideoPresets.fhd.resolution;
}
// typescript definition is missing getDisplayMedia: https://github.com/microsoft/TypeScript/issues/33232
// @ts-ignore
const stream = yield navigator.mediaDevices.getDisplayMedia({
audio: false,
video: {
width: options.resolution.width,
height: options.resolution.height,
},
});
const tracks = stream.getVideoTracks();
if (tracks.length === 0) {
throw new errors_1.TrackInvalidError('no video track found');
}
return new LocalVideoTrack_1.default(tracks[0], options.name);
});
}
exports.createLocalScreenTrack = createLocalScreenTrack;
/**
* creates a local video and audio track at the same time

@@ -127,0 +159,0 @@ * @param options

@@ -32,2 +32,4 @@ import _m0 from "protobufjs/minimal";

joinedAt: number;
/** hidden participant (used for recording) */
hidden: boolean;
}

@@ -67,2 +69,29 @@ export declare enum ParticipantInfo_State {

}
export interface RecordingTemplate {
layout: string;
wsUrl: string;
/** either token or room name required */
token: string;
roomName: string;
}
export interface RecordingS3Output {
bucket: string;
key: string;
/** optional */
accessKey: string;
secret: string;
}
export interface RecordingOptions {
/** 720p30, 720p60, 1080p30, or 1080p60 */
preset: string;
inputWidth: number;
inputHeight: number;
outputWidth: number;
outputHeight: number;
depth: number;
framerate: number;
audioBitrate: number;
audioFrequency: number;
videoBitrate: number;
}
export declare const Room: {

@@ -103,2 +132,23 @@ encode(message: Room, writer?: _m0.Writer): _m0.Writer;

};
export declare const RecordingTemplate: {
encode(message: RecordingTemplate, writer?: _m0.Writer): _m0.Writer;
decode(input: _m0.Reader | Uint8Array, length?: number | undefined): RecordingTemplate;
fromJSON(object: any): RecordingTemplate;
toJSON(message: RecordingTemplate): unknown;
fromPartial(object: DeepPartial<RecordingTemplate>): RecordingTemplate;
};
export declare const RecordingS3Output: {
encode(message: RecordingS3Output, writer?: _m0.Writer): _m0.Writer;
decode(input: _m0.Reader | Uint8Array, length?: number | undefined): RecordingS3Output;
fromJSON(object: any): RecordingS3Output;
toJSON(message: RecordingS3Output): unknown;
fromPartial(object: DeepPartial<RecordingS3Output>): RecordingS3Output;
};
export declare const RecordingOptions: {
encode(message: RecordingOptions, writer?: _m0.Writer): _m0.Writer;
decode(input: _m0.Reader | Uint8Array, length?: number | undefined): RecordingOptions;
fromJSON(object: any): RecordingOptions;
toJSON(message: RecordingOptions): unknown;
fromPartial(object: DeepPartial<RecordingOptions>): RecordingOptions;
};
declare type Builtin = Date | Function | Uint8Array | string | number | undefined;

@@ -105,0 +155,0 @@ export declare type DeepPartial<T> = T extends Builtin ? T : T extends Array<infer U> ? Array<DeepPartial<U>> : T extends ReadonlyArray<infer U> ? ReadonlyArray<DeepPartial<U>> : T extends {} ? {

@@ -6,3 +6,3 @@ "use strict";

Object.defineProperty(exports, "__esModule", { value: true });
exports.DataMessage = exports.TrackInfo = exports.ParticipantInfo = exports.Codec = exports.Room = exports.participantInfo_StateToJSON = exports.participantInfo_StateFromJSON = exports.ParticipantInfo_State = exports.trackTypeToJSON = exports.trackTypeFromJSON = exports.TrackType = exports.protobufPackage = void 0;
exports.RecordingOptions = exports.RecordingS3Output = exports.RecordingTemplate = exports.DataMessage = exports.TrackInfo = exports.ParticipantInfo = exports.Codec = exports.Room = exports.participantInfo_StateToJSON = exports.participantInfo_StateFromJSON = exports.ParticipantInfo_State = exports.trackTypeToJSON = exports.trackTypeFromJSON = exports.TrackType = exports.protobufPackage = void 0;
/* eslint-disable */

@@ -358,2 +358,3 @@ const long_1 = __importDefault(require("long"));

joinedAt: 0,
hidden: false,
};

@@ -380,2 +381,5 @@ exports.ParticipantInfo = {

}
if (message.hidden === true) {
writer.uint32(56).bool(message.hidden);
}
return writer;

@@ -409,2 +413,5 @@ },

break;
case 7:
message.hidden = reader.bool();
break;
default:

@@ -455,2 +462,8 @@ reader.skipType(tag & 7);

}
if (object.hidden !== undefined && object.hidden !== null) {
message.hidden = Boolean(object.hidden);
}
else {
message.hidden = false;
}
return message;

@@ -472,2 +485,3 @@ },

message.joinedAt !== undefined && (obj.joinedAt = message.joinedAt);
message.hidden !== undefined && (obj.hidden = message.hidden);
return obj;

@@ -513,2 +527,8 @@ },

}
if (object.hidden !== undefined && object.hidden !== null) {
message.hidden = object.hidden;
}
else {
message.hidden = false;
}
return message;

@@ -761,2 +781,467 @@ },

};
const baseRecordingTemplate = {
layout: "",
wsUrl: "",
token: "",
roomName: "",
};
exports.RecordingTemplate = {
encode(message, writer = minimal_1.default.Writer.create()) {
if (message.layout !== "") {
writer.uint32(10).string(message.layout);
}
if (message.wsUrl !== "") {
writer.uint32(18).string(message.wsUrl);
}
if (message.token !== "") {
writer.uint32(26).string(message.token);
}
if (message.roomName !== "") {
writer.uint32(34).string(message.roomName);
}
return writer;
},
decode(input, length) {
const reader = input instanceof minimal_1.default.Reader ? input : new minimal_1.default.Reader(input);
let end = length === undefined ? reader.len : reader.pos + length;
const message = Object.assign({}, baseRecordingTemplate);
while (reader.pos < end) {
const tag = reader.uint32();
switch (tag >>> 3) {
case 1:
message.layout = reader.string();
break;
case 2:
message.wsUrl = reader.string();
break;
case 3:
message.token = reader.string();
break;
case 4:
message.roomName = reader.string();
break;
default:
reader.skipType(tag & 7);
break;
}
}
return message;
},
fromJSON(object) {
const message = Object.assign({}, baseRecordingTemplate);
if (object.layout !== undefined && object.layout !== null) {
message.layout = String(object.layout);
}
else {
message.layout = "";
}
if (object.wsUrl !== undefined && object.wsUrl !== null) {
message.wsUrl = String(object.wsUrl);
}
else {
message.wsUrl = "";
}
if (object.token !== undefined && object.token !== null) {
message.token = String(object.token);
}
else {
message.token = "";
}
if (object.roomName !== undefined && object.roomName !== null) {
message.roomName = String(object.roomName);
}
else {
message.roomName = "";
}
return message;
},
toJSON(message) {
const obj = {};
message.layout !== undefined && (obj.layout = message.layout);
message.wsUrl !== undefined && (obj.wsUrl = message.wsUrl);
message.token !== undefined && (obj.token = message.token);
message.roomName !== undefined && (obj.roomName = message.roomName);
return obj;
},
fromPartial(object) {
const message = Object.assign({}, baseRecordingTemplate);
if (object.layout !== undefined && object.layout !== null) {
message.layout = object.layout;
}
else {
message.layout = "";
}
if (object.wsUrl !== undefined && object.wsUrl !== null) {
message.wsUrl = object.wsUrl;
}
else {
message.wsUrl = "";
}
if (object.token !== undefined && object.token !== null) {
message.token = object.token;
}
else {
message.token = "";
}
if (object.roomName !== undefined && object.roomName !== null) {
message.roomName = object.roomName;
}
else {
message.roomName = "";
}
return message;
},
};
const baseRecordingS3Output = {
bucket: "",
key: "",
accessKey: "",
secret: "",
};
exports.RecordingS3Output = {
encode(message, writer = minimal_1.default.Writer.create()) {
if (message.bucket !== "") {
writer.uint32(10).string(message.bucket);
}
if (message.key !== "") {
writer.uint32(18).string(message.key);
}
if (message.accessKey !== "") {
writer.uint32(26).string(message.accessKey);
}
if (message.secret !== "") {
writer.uint32(34).string(message.secret);
}
return writer;
},
decode(input, length) {
const reader = input instanceof minimal_1.default.Reader ? input : new minimal_1.default.Reader(input);
let end = length === undefined ? reader.len : reader.pos + length;
const message = Object.assign({}, baseRecordingS3Output);
while (reader.pos < end) {
const tag = reader.uint32();
switch (tag >>> 3) {
case 1:
message.bucket = reader.string();
break;
case 2:
message.key = reader.string();
break;
case 3:
message.accessKey = reader.string();
break;
case 4:
message.secret = reader.string();
break;
default:
reader.skipType(tag & 7);
break;
}
}
return message;
},
fromJSON(object) {
const message = Object.assign({}, baseRecordingS3Output);
if (object.bucket !== undefined && object.bucket !== null) {
message.bucket = String(object.bucket);
}
else {
message.bucket = "";
}
if (object.key !== undefined && object.key !== null) {
message.key = String(object.key);
}
else {
message.key = "";
}
if (object.accessKey !== undefined && object.accessKey !== null) {
message.accessKey = String(object.accessKey);
}
else {
message.accessKey = "";
}
if (object.secret !== undefined && object.secret !== null) {
message.secret = String(object.secret);
}
else {
message.secret = "";
}
return message;
},
toJSON(message) {
const obj = {};
message.bucket !== undefined && (obj.bucket = message.bucket);
message.key !== undefined && (obj.key = message.key);
message.accessKey !== undefined && (obj.accessKey = message.accessKey);
message.secret !== undefined && (obj.secret = message.secret);
return obj;
},
fromPartial(object) {
const message = Object.assign({}, baseRecordingS3Output);
if (object.bucket !== undefined && object.bucket !== null) {
message.bucket = object.bucket;
}
else {
message.bucket = "";
}
if (object.key !== undefined && object.key !== null) {
message.key = object.key;
}
else {
message.key = "";
}
if (object.accessKey !== undefined && object.accessKey !== null) {
message.accessKey = object.accessKey;
}
else {
message.accessKey = "";
}
if (object.secret !== undefined && object.secret !== null) {
message.secret = object.secret;
}
else {
message.secret = "";
}
return message;
},
};
const baseRecordingOptions = {
preset: "",
inputWidth: 0,
inputHeight: 0,
outputWidth: 0,
outputHeight: 0,
depth: 0,
framerate: 0,
audioBitrate: 0,
audioFrequency: 0,
videoBitrate: 0,
};
exports.RecordingOptions = {
encode(message, writer = minimal_1.default.Writer.create()) {
if (message.preset !== "") {
writer.uint32(10).string(message.preset);
}
if (message.inputWidth !== 0) {
writer.uint32(16).int32(message.inputWidth);
}
if (message.inputHeight !== 0) {
writer.uint32(24).int32(message.inputHeight);
}
if (message.outputWidth !== 0) {
writer.uint32(32).int32(message.outputWidth);
}
if (message.outputHeight !== 0) {
writer.uint32(40).int32(message.outputHeight);
}
if (message.depth !== 0) {
writer.uint32(48).int32(message.depth);
}
if (message.framerate !== 0) {
writer.uint32(56).int32(message.framerate);
}
if (message.audioBitrate !== 0) {
writer.uint32(64).int32(message.audioBitrate);
}
if (message.audioFrequency !== 0) {
writer.uint32(72).int32(message.audioFrequency);
}
if (message.videoBitrate !== 0) {
writer.uint32(80).int32(message.videoBitrate);
}
return writer;
},
decode(input, length) {
const reader = input instanceof minimal_1.default.Reader ? input : new minimal_1.default.Reader(input);
let end = length === undefined ? reader.len : reader.pos + length;
const message = Object.assign({}, baseRecordingOptions);
while (reader.pos < end) {
const tag = reader.uint32();
switch (tag >>> 3) {
case 1:
message.preset = reader.string();
break;
case 2:
message.inputWidth = reader.int32();
break;
case 3:
message.inputHeight = reader.int32();
break;
case 4:
message.outputWidth = reader.int32();
break;
case 5:
message.outputHeight = reader.int32();
break;
case 6:
message.depth = reader.int32();
break;
case 7:
message.framerate = reader.int32();
break;
case 8:
message.audioBitrate = reader.int32();
break;
case 9:
message.audioFrequency = reader.int32();
break;
case 10:
message.videoBitrate = reader.int32();
break;
default:
reader.skipType(tag & 7);
break;
}
}
return message;
},
fromJSON(object) {
const message = Object.assign({}, baseRecordingOptions);
if (object.preset !== undefined && object.preset !== null) {
message.preset = String(object.preset);
}
else {
message.preset = "";
}
if (object.inputWidth !== undefined && object.inputWidth !== null) {
message.inputWidth = Number(object.inputWidth);
}
else {
message.inputWidth = 0;
}
if (object.inputHeight !== undefined && object.inputHeight !== null) {
message.inputHeight = Number(object.inputHeight);
}
else {
message.inputHeight = 0;
}
if (object.outputWidth !== undefined && object.outputWidth !== null) {
message.outputWidth = Number(object.outputWidth);
}
else {
message.outputWidth = 0;
}
if (object.outputHeight !== undefined && object.outputHeight !== null) {
message.outputHeight = Number(object.outputHeight);
}
else {
message.outputHeight = 0;
}
if (object.depth !== undefined && object.depth !== null) {
message.depth = Number(object.depth);
}
else {
message.depth = 0;
}
if (object.framerate !== undefined && object.framerate !== null) {
message.framerate = Number(object.framerate);
}
else {
message.framerate = 0;
}
if (object.audioBitrate !== undefined && object.audioBitrate !== null) {
message.audioBitrate = Number(object.audioBitrate);
}
else {
message.audioBitrate = 0;
}
if (object.audioFrequency !== undefined && object.audioFrequency !== null) {
message.audioFrequency = Number(object.audioFrequency);
}
else {
message.audioFrequency = 0;
}
if (object.videoBitrate !== undefined && object.videoBitrate !== null) {
message.videoBitrate = Number(object.videoBitrate);
}
else {
message.videoBitrate = 0;
}
return message;
},
toJSON(message) {
const obj = {};
message.preset !== undefined && (obj.preset = message.preset);
message.inputWidth !== undefined && (obj.inputWidth = message.inputWidth);
message.inputHeight !== undefined &&
(obj.inputHeight = message.inputHeight);
message.outputWidth !== undefined &&
(obj.outputWidth = message.outputWidth);
message.outputHeight !== undefined &&
(obj.outputHeight = message.outputHeight);
message.depth !== undefined && (obj.depth = message.depth);
message.framerate !== undefined && (obj.framerate = message.framerate);
message.audioBitrate !== undefined &&
(obj.audioBitrate = message.audioBitrate);
message.audioFrequency !== undefined &&
(obj.audioFrequency = message.audioFrequency);
message.videoBitrate !== undefined &&
(obj.videoBitrate = message.videoBitrate);
return obj;
},
fromPartial(object) {
const message = Object.assign({}, baseRecordingOptions);
if (object.preset !== undefined && object.preset !== null) {
message.preset = object.preset;
}
else {
message.preset = "";
}
if (object.inputWidth !== undefined && object.inputWidth !== null) {
message.inputWidth = object.inputWidth;
}
else {
message.inputWidth = 0;
}
if (object.inputHeight !== undefined && object.inputHeight !== null) {
message.inputHeight = object.inputHeight;
}
else {
message.inputHeight = 0;
}
if (object.outputWidth !== undefined && object.outputWidth !== null) {
message.outputWidth = object.outputWidth;
}
else {
message.outputWidth = 0;
}
if (object.outputHeight !== undefined && object.outputHeight !== null) {
message.outputHeight = object.outputHeight;
}
else {
message.outputHeight = 0;
}
if (object.depth !== undefined && object.depth !== null) {
message.depth = object.depth;
}
else {
message.depth = 0;
}
if (object.framerate !== undefined && object.framerate !== null) {
message.framerate = object.framerate;
}
else {
message.framerate = 0;
}
if (object.audioBitrate !== undefined && object.audioBitrate !== null) {
message.audioBitrate = object.audioBitrate;
}
else {
message.audioBitrate = 0;
}
if (object.audioFrequency !== undefined && object.audioFrequency !== null) {
message.audioFrequency = object.audioFrequency;
}
else {
message.audioFrequency = 0;
}
if (object.videoBitrate !== undefined && object.videoBitrate !== null) {
message.videoBitrate = object.videoBitrate;
}
else {
message.videoBitrate = 0;
}
return message;
},
};
var globalThis = (() => {

@@ -763,0 +1248,0 @@ if (typeof globalThis !== "undefined")

@@ -62,2 +62,4 @@ import _m0 from "protobufjs/minimal";

height: number;
/** true to add track and initialize to muted */
muted: boolean;
}

@@ -64,0 +66,0 @@ export interface TrickleRequest {

@@ -100,6 +100,19 @@ "use strict";

track.on(events_1.TrackEvent.Unmuted, this.onTrackUnmuted);
track.mediaStreamTrack.addEventListener('ended', () => {
this.unpublishTrack(track);
});
// get local track id for use during publishing
const cid = track.mediaStreamTrack.id;
// create track publication from track
const ti = yield this.engine.addTrack(cid, track.name, track.kind, track.dimensions);
const req = livekit_rtc_1.AddTrackRequest.fromPartial({
cid,
name: track.name,
type: Track_1.Track.kindToProto(track.kind),
muted: track.isMuted,
});
if (track.dimensions) {
req.width = track.dimensions.width;
req.height = track.dimensions.height;
}
const ti = yield this.engine.addTrack(req);
const publication = new LocalTrackPublication_1.default(track.kind, ti, track);

@@ -148,3 +161,2 @@ track.sid = ti.sid;

loglevel_1.default.debug('unpublishTrack', 'unpublishing track', track);
// TODO: add logging
if (!publication) {

@@ -295,4 +307,12 @@ loglevel_1.default.warn('unpublishTrack', 'track was not unpublished because no publication was found', track);

return;
const selected = cap.codecs.find((c) => c.mimeType.toLowerCase() === `video/${videoCodec}`
|| c.mimeType.toLowerCase() === 'audio/opus');
const selected = cap.codecs.find((c) => {
const codec = c.mimeType.toLowerCase();
const matchesVideoCodec = codec === `video/${videoCodec}`;
// for h264 codecs that have sdpFmtpLine available, use only if the
// profile-level-id is 42e01f for cross-browser compatibility
if (videoCodec === 'h264' && c.sdpFmtpLine) {
return matchesVideoCodec && c.sdpFmtpLine.includes('profile-level-id=42e01f');
}
return matchesVideoCodec || codec === 'audio/opus';
});
if (selected && 'setCodecPreferences' in transceiver) {

@@ -320,2 +340,3 @@ transceiver.setCodecPreferences([selected]);

maxBitrate: videoEncoding.maxBitrate,
/* @ts-ignore */
maxFramerate: videoEncoding.maxFramerate,

@@ -334,2 +355,3 @@ },

maxBitrate: midPreset.encoding.maxBitrate,
/* @ts-ignore */
maxFramerate: midPreset.encoding.maxFramerate,

@@ -341,2 +363,3 @@ });

maxBitrate: lowPreset.encoding.maxBitrate,
/* @ts-ignore */
maxFramerate: lowPreset.encoding.maxFramerate,

@@ -350,2 +373,3 @@ });

maxBitrate: lowPreset.encoding.maxBitrate,
/* @ts-ignore */
maxFramerate: lowPreset.encoding.maxFramerate,

@@ -352,0 +376,0 @@ });

@@ -33,2 +33,3 @@ "use strict";

this.pendingCandidates = [];
this.restartingIce = false;
});

@@ -35,0 +36,0 @@ }

@@ -39,2 +39,3 @@ /// <reference types="node" />

private audioEnabled;
private audioContext?;
/** @internal */

@@ -68,2 +69,3 @@ constructor(client: SignalClient, config?: RTCConfiguration);

private handleAudioPlaybackFailed;
private acquireAudioContext;
private getOrCreateParticipant;

@@ -70,0 +72,0 @@ /** @internal */

@@ -167,2 +167,3 @@ "use strict";

this.engine = new RTCEngine_1.default(client, config);
this.acquireAudioContext();
this.engine.on(events_2.EngineEvent.MediaTrackAdded, (mediaTrack, stream, receiver) => {

@@ -197,2 +198,3 @@ this.onTrackAdded(mediaTrack, stream, receiver);

return __awaiter(this, void 0, void 0, function* () {
this.acquireAudioContext();
const elements = [];

@@ -249,2 +251,6 @@ this.participants.forEach((p) => {

this.activeSpeakers = [];
if (this.audioContext) {
this.audioContext.close();
this.audioContext = undefined;
}
window.removeEventListener('beforeunload', this.disconnect);

@@ -290,2 +296,14 @@ this.emit(events_2.RoomEvent.Disconnected);

}
acquireAudioContext() {
if (this.audioContext) {
this.audioContext.close();
}
// by using an AudioContext, it reduces lag on audio elements
// https://stackoverflow.com/questions/9811429/html5-audio-tag-on-safari-has-a-delay/54119854#54119854
// @ts-ignore
const AudioContext = window.AudioContext || window.webkitAudioContext;
if (AudioContext) {
this.audioContext = new AudioContext();
}
}
getOrCreateParticipant(id, info) {

@@ -292,0 +310,0 @@ let participant = this.participants.get(id);

@@ -5,5 +5,4 @@ /// <reference types="node" />

import { TrackInfo } from '../proto/livekit_models';
import { JoinResponse } from '../proto/livekit_rtc';
import { AddTrackRequest, JoinResponse } from '../proto/livekit_rtc';
import PCTransport from './PCTransport';
import { Track } from './track/Track';
export default class RTCEngine extends EventEmitter {

@@ -28,3 +27,3 @@ publisher?: PCTransport;

close(): void;
addTrack(cid: string, name: string, kind: Track.Kind, dimension?: Track.Dimensions): Promise<TrackInfo>;
addTrack(req: AddTrackRequest): Promise<TrackInfo>;
updateMuteStatus(trackSid: string, muted: boolean): void;

@@ -31,0 +30,0 @@ private configure;

@@ -21,3 +21,2 @@ "use strict";

const PCTransport_1 = __importDefault(require("./PCTransport"));
const Track_1 = require("./track/Track");
const utils_1 = require("./utils");

@@ -144,9 +143,9 @@ const lossyDataChannel = '_lossy';

}
addTrack(cid, name, kind, dimension) {
if (this.pendingTrackResolvers[cid]) {
addTrack(req) {
if (this.pendingTrackResolvers[req.cid]) {
throw new errors_1.TrackInvalidError('a track with the same ID has already been published');
}
return new Promise((resolve) => {
this.pendingTrackResolvers[cid] = resolve;
this.client.sendAddTrack(cid, name, Track_1.Track.kindToProto(kind), dimension);
this.pendingTrackResolvers[req.cid] = resolve;
this.client.sendAddTrack(req);
});

@@ -311,3 +310,3 @@ }

}
yield utils_1.sleep(500);
yield utils_1.sleep(100);
}

@@ -314,0 +313,0 @@ // have not reconnected, throw

@@ -24,3 +24,2 @@ "use strict";

this.constraints = constraints !== null && constraints !== void 0 ? constraints : mediaTrack.getConstraints();
loglevel_1.default.debug('track created, constraints', this.constraints);
}

@@ -27,0 +26,0 @@ get id() {

@@ -57,2 +57,8 @@ /**

}
export interface CreateScreenTrackOptions {
/** name of track, defaults to "screen" */
name?: string;
/** capture resolution, defaults to full HD */
resolution?: VideoResolutionConstraint;
}
export interface CreateAudioTrackOptions extends CreateLocalTrackOptions {

@@ -59,0 +65,0 @@ /**

@@ -1,2 +0,2 @@

export declare const version = "0.11.0";
export declare const version = "0.11.1";
export declare const protocolVersion = 2;
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.protocolVersion = exports.version = void 0;
exports.version = '0.11.0';
exports.version = '0.11.1';
exports.protocolVersion = 2;
//# sourceMappingURL=version.js.map
import {
connect, CreateVideoTrackOptions,
connect, createLocalScreenTrack, CreateVideoTrackOptions,
LocalAudioTrack,

@@ -282,20 +282,6 @@ LocalTrack,

const preset = VideoPresets.hd;
// typescript definition is missing getDisplayMedia: https://github.com/microsoft/TypeScript/issues/33232
// @ts-ignore
const ssMediaStream: MediaStream = await navigator.mediaDevices.getDisplayMedia({
audio: false,
video: {
width: preset.resolution.width,
height: preset.resolution.height,
},
screenTrack = await createLocalScreenTrack();
await currentRoom.localParticipant.publishTrack(screenTrack, {
videoEncoding: VideoPresets.fhd.encoding,
});
for (const t of ssMediaStream.getTracks()) {
screenTrack = new LocalVideoTrack(t, 'screen');
await currentRoom.localParticipant.publishTrack(t, {
videoEncoding: { maxFramerate: 30, maxBitrate: 3000000 },
videoCodec: 'h264',
simulcast: false,
});
}
};

@@ -302,0 +288,0 @@

{
"name": "livekit-client",
"version": "0.11.0",
"version": "0.11.1",
"description": "JavaScript/TypeScript client SDK for LiveKit",

@@ -39,3 +39,3 @@ "main": "dist/index.js",

"typedoc-plugin-no-inherit": "^1.2.0",
"typescript": "^4.2.3",
"typescript": "~4.2.3",
"webpack": "^5.9.0",

@@ -42,0 +42,0 @@ "webpack-cli": "^4.2.0",

@@ -114,5 +114,5 @@ # JavaScript/TypeScript client SDK for LiveKit

Browsers can be restrictive regarding if audio could be played without user interaction. What each browser considers as user interaction can also be different (with Safari on iOS being the most restrictive). Some browser considers clicking on a button unrelated to audio as interaction, others require audio element's `play` function to be triggered by an onclick event.
Browsers can be restrictive with regards to audio playback that is not initiated by user interaction. What each browser considers as user interaction can vary by vendor (for example, Safari on iOS is very restrictive).
LiveKit will attempt to autoplay all audio tracks when you attach them to audio elements. However, if that fails, we'll notify you via `RoomEvent.AudioPlaybackStatusChanged`. `Room.canPlayAudio` will indicate if audio playback is permitted. (Note: LiveKit takes an optimistic approach so it's possible for this value to change from `true` to `false` when we encounter a browser error.
LiveKit will attempt to autoplay all audio tracks when you attach them to audio elements. However, if that fails, we'll notify you via `RoomEvent.AudioPlaybackStatusChanged`. `Room.canPlayAudio` will indicate if audio playback is permitted. LiveKit takes an optimistic approach so it's possible for this value to change from `true` to `false` when we encounter a browser error.

@@ -155,3 +155,3 @@ In the case user interaction is required, LiveKit provides `Room.startAudio` to start audio playback. This function must be triggered in an onclick or ontap event handler. In the same session, once audio playback is successful, additional audio tracks can be played without further user interactions.
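
A hedged sketch of the pattern the README describes; the url, token, and audioButton names are assumptions, not part of the SDK:

import { connect, RoomEvent } from 'livekit-client';

const room = await connect(url, token);
room.on(RoomEvent.AudioPlaybackStatusChanged, () => {
  // canPlayAudio can flip from true to false if the browser later blocks playback
  if (!room.canPlayAudio) {
    // startAudio must be called from a user gesture such as a click or tap
    audioButton.onclick = () => room.startAudio();
  }
});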

This library uses (loglevel)[] for its internal logs. You can change the effective log level with the `logLevel` field in `ConnectOptions`.
This library uses [loglevel](https://github.com/pimterry/loglevel) for its internal logs. You can change the effective log level with the `logLevel` field in `ConnectOptions`.
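
A one-line sketch of the logLevel option mentioned above; 'debug' assumes loglevel's standard level names, and url and token are placeholders:

import { connect } from 'livekit-client';

// raise verbosity for troubleshooting; other loglevel names ('warn', 'error', ...) work the same way
const room = await connect(url, token, { logLevel: 'debug' });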

@@ -158,0 +158,0 @@ ## Examples
