# 📹🎙️ Node.js realtime SDK for LiveKit

Use this SDK to add realtime video, audio, and data features to your Node app. By connecting to a self- or cloud-hosted LiveKit server, you can quickly build applications like interactive live streaming or video calls with just a few lines of code.

> [!NOTE]
> This SDK is currently in Developer Preview and not yet ready for production use. Expect bugs, and APIs may change during this period.

We welcome and appreciate any feedback or contributions. You can create issues here or chat live with us in the #dev channel of the LiveKit Community Slack.
## Using realtime SDK

### Connecting to a room

```typescript
import {
  RemoteParticipant,
  RemoteTrack,
  RemoteTrackPublication,
  Room,
  RoomEvent,
  dispose,
} from '@livekit/rtc-node';

const room = new Room();
await room.connect(url, token, { autoSubscribe: true, dynacast: true });
console.log('connected to room', room);

room
  .on(RoomEvent.TrackSubscribed, handleTrackSubscribed)
  .on(RoomEvent.Disconnected, handleDisconnected)
  .on(RoomEvent.LocalTrackPublished, handleLocalTrackPublished);

// the callback must be async so we can await cleanup before exiting
process.on('SIGINT', async () => {
  await room.disconnect();
  await dispose();
});
```
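The event handlers registered above are user-defined. A minimal sketch of what they might look like — in a real app the callback parameters are typed `RemoteTrack`, `RemoteTrackPublication`, and `RemoteParticipant` from `@livekit/rtc-node`; lightweight stand-in interfaces are used here so the snippet is self-contained:

```typescript
// Stand-in shapes for the SDK types, for illustration only.
interface ParticipantLike { identity: string }
interface PublicationLike { sid: string }
interface TrackLike { sid: string }

// Fired once per remote track after subscription succeeds.
function handleTrackSubscribed(
  track: TrackLike,
  publication: PublicationLike,
  participant: ParticipantLike,
): string {
  const msg = `subscribed to track ${publication.sid} from ${participant.identity}`;
  console.log(msg);
  return msg;
}

// Fired when the connection to the room ends.
function handleDisconnected(): string {
  const msg = 'disconnected from room';
  console.log(msg);
  return msg;
}

// Fired after one of our own tracks is published.
function handleLocalTrackPublished(publication: PublicationLike): string {
  const msg = `published local track ${publication.sid}`;
  console.log(msg);
  return msg;
}
```

The handlers return the logged message only to keep the sketch easy to test; `void` handlers are equally valid.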
### Publishing a track

```typescript
import {
  AudioFrame,
  AudioSource,
  LocalAudioTrack,
  TrackPublishOptions,
  TrackSource,
} from '@livekit/rtc-node';
import { readFileSync } from 'node:fs';

const source = new AudioSource(16000, 1);
const track = LocalAudioTrack.createAudioTrack('audio', source);
const options = new TrackPublishOptions();
options.source = TrackSource.SOURCE_MICROPHONE;

// read raw 16-bit signed PCM (16 kHz, mono) from disk
const sample = readFileSync(pathToFile);
const buffer = new Int16Array(sample.buffer);

await room.localParticipant.publishTrack(track, options);
await source.captureFrame(new AudioFrame(buffer, 16000, 1, buffer.byteLength / 2));
```
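One detail worth calling out from the snippet above: raw PCM audio here is 16-bit signed, so each mono sample occupies two bytes, which is why the frame's samples-per-channel count is `byteLength / 2`. A small self-contained helper (hypothetical, not part of the SDK) that performs the Buffer-to-samples conversion:

```typescript
// Hypothetical helper: convert a Node.js Buffer of raw 16-bit signed PCM
// into samples plus the per-channel sample count an audio frame expects.
// Copying via ArrayBuffer.slice keeps the Int16Array view correctly
// aligned even when the Buffer is carved out of Node's internal pool.
function pcmToSamples(
  raw: Buffer,
  numChannels: number,
): { samples: Int16Array; samplesPerChannel: number } {
  const samples = new Int16Array(
    raw.buffer.slice(raw.byteOffset, raw.byteOffset + raw.byteLength),
  );
  return { samples, samplesPerChannel: samples.length / numChannels };
}
```

For the mono file above, `pcmToSamples(sample, 1)` yields the same sample values as `new Int16Array(sample.buffer)` together with `samplesPerChannel = byteLength / 2`.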
## Examples

## Getting help / Contributing

Please join us on Slack to get help from our devs and community. We welcome your contributions, and details can be discussed there.