# Channels

Channels is a channel-based sound player for the web.
## Installation

```sh
npm install @mediamonks/channels
```
## Quick start

```js
import { Channels } from '@mediamonks/channels';

// define the list of sound files to load
const soundFiles = ['sound1', 'sound2'].map(name => ({
  name,
}));

// create a Channels instance
const channelsInstance = new Channels({
  soundsExtension: 'mp3',
  soundsPath: 'static/audio/',
  sounds: soundFiles,
});

// load the sound files
await channelsInstance.loadSounds();

// play a sound
channelsInstance.play('sound1');

// create a channel, and play a sound on it
channelsInstance.createChannel('background-music');
channelsInstance.play('sound2', { channel: 'background-music' });

// or play a sound through a channel reference
const myChannel = channelsInstance.createChannel('ui-sounds');
myChannel.play('sound2');

// stop a playing sound
const sound1 = channelsInstance.play('sound1');
sound1.stop();

// stop all sounds, or all sounds on a channel
channelsInstance.stopAll();
channelsInstance.stopAll({ channel: 'background-music' });
myChannel.stopAll();

// set the volume of the main output, or of a channel
channelsInstance.setVolume(0.5);
myChannel.setVolume(0.5);
```
## Getting started

Before anything can be done, an instance of `Channels` has to be created. Two parameters are required: the location of the sound files, and the file extension to use:

```js
new Channels({
  soundsPath: 'location/of/your/files',
  soundsExtension: 'mp3',
});
```

Don't create more than one `Channels` instance.

Optionally, an `audioContext` can be passed in the constructor options. If omitted, one will be created in the `Channels` constructor.

```js
new Channels({
  soundsPath,
  soundsExtension,
  audioContext: myAudioContext,
});
```
## React

For React projects, there is a `use-channels` hook to easily work with a `Channels` instance.
## Suspended state

An `AudioContext` created without user interaction (for example a click) will be in the `suspended` state, in which no sound can be produced. This can happen, for example, when a `Channels` instance is created on page landing without supplying a (non-suspended) `audioContext`, since one will then be created automatically.

Creating a `Channels` instance this way is fine by itself; just make sure to resume the context once on user interaction, before playing any sounds:

```js
const onClick = async () => {
  await channelsInstance.resumeContext();
};
```

TLDR: the `audioContext` that is used must have been created or resumed on user interaction.

To check whether the context is suspended:

```js
channelsInstance.contextIsSuspended;
```
## Loading files

`Channels` uses the sample-manager package for dealing with files, and creates an instance of it named `sampleManager`:

```js
channelsInstance.sampleManager;
```

The easiest way to load files is to supply a list of objects with a `name` property, matching the filenames without extension. The file extension has to be set when creating the `Channels` object (which allows for an easy switch to different file types on certain clients).

```js
const soundFiles = [{ name: 'sound1' }, { name: 'sound2' }];

// supply the sounds on creation
const channelsInstance = new Channels({
  soundsPath: 'soundfiles/',
  soundsExtension: 'mp3',
  sounds: soundFiles,
});

// or add them afterwards through the sampleManager
channelsInstance.sampleManager.addSamples(soundFiles);

// load all sounds
await channelsInstance.loadSounds();

// optionally, with a progress callback
await channelsInstance.loadSounds(progress => {
  console.log(progress);
});
```
The `loadSounds` method is an alias for `sampleManager.loadAllSamples`.

For more info on how to define sound files, please refer to the sample-manager page.
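Since the extension is configured once on the `Channels` instance, switching file types per client comes down to computing the extension up front. A minimal sketch, assuming you detect codec support through the browser's `canPlayType` (the detection itself is not part of this library):

```javascript
// Use ogg where supported, and fall back to mp3 elsewhere.
// The Audio constructor only exists in the browser.
const supportsOgg =
  typeof Audio === 'function' && new Audio().canPlayType('audio/ogg') !== '';

const soundsExtension = supportsOgg ? 'ogg' : 'mp3';
```

The resulting `soundsExtension` can then be passed to the `Channels` constructor.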
## Playing a sound

When a sound has been loaded, it can be played by referring to its unique `name`:

```js
channelsInstance.play('sound');
```

A second argument can be passed with optional properties:

```js
channelsInstance.play('sound', {
  volume: 0.5,
  channel: 'channel1',
  loop: true,
  fadeInTime: 2,
  pan: 1,
  startTimeOffset: 0,
  effects: {
    preVolume: myEffectsChain,
  },
});
```

The `play` function returns a reference to the playing sound, containing various methods to interact with the sound:

```js
const sound = channelsInstance.play('sound');
sound.setVolume(0.5);
```
## Stopping a sound

Stopping a sound is done by calling `stop()` on the playing sound reference:

```js
const playingSound = channelsInstance.play('sound');
playingSound.stop();
```

Sounds can be faded out before stopping by providing a `fadeOutTime`:

```js
playingSound.stop({ fadeOutTime: 2 });
```
## Channels

Channels are a way of grouping sounds that are played. Each channel has its own volume and optional effects, and its output connects to the main output. Channels are completely optional and might not be needed at all, since sounds can also be played without one.

The reason to create a channel is to easily manage a group of sounds, for example to:

- change their volume
- apply effects
- fade them out
- stop all of them
### Creating a channel

The only thing needed to create a channel is a unique name:

```js
channelsInstance.createChannel('my-channel');
```

A second parameter can be used to set some optional properties:

```js
channelsInstance.createChannel('my-channel', {
  type: 'monophonic',
  volume: 0.5,
  pan: 1,
  effects: {
    preVolume: myEffectsChain,
  },
});
```

Check the Audio Effects section for more information about the `effects` option.

A reference to a channel is returned when creating it:

```js
const myChannel = channelsInstance.createChannel('my-channel');
```

Or it can be retrieved afterwards:

```js
const myChannel = channelsInstance.getChannel('my-channel');
```
### Monophonic vs polyphonic

A `Channel` can be either polyphonic or monophonic, which defines how many sounds can be played simultaneously on the channel:

- A **monophonic** channel can play one sound at a time. When playing a sound on such a channel, all other sounds on that channel will be stopped.
- A **polyphonic** channel has no restrictions.

The term *monophonic* is used loosely: since sounds can fade in and out, even on a monophonic channel multiple sounds may be audible at the same time.

The `type` can be set during creation. When no `type` is given, the default `polyphonic` is used.

```js
channelsInstance.createChannel('monophonic-channel', { type: 'monophonic' });
channelsInstance.createChannel('polyphonic-channel'); // polyphonic by default
```

Using a monophonic channel can be very helpful when creating a background music layer where the music loop needs to be changed now and then.
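Conceptually, a monophonic channel behaves like the following sketch (an illustration only, not the library's actual implementation): playing a new sound first stops everything currently on the channel.

```javascript
// Sketch of monophonic behavior: at most one sound plays at a time.
class MonophonicChannelSketch {
  constructor() {
    this.playingSounds = [];
  }

  play(name) {
    // stop whatever is currently playing on this channel
    this.playingSounds.forEach(sound => (sound.stopped = true));
    this.playingSounds = [];

    const sound = { name, stopped: false };
    this.playingSounds.push(sound);
    return sound;
  }
}

const channel = new MonophonicChannelSketch();
const loop1 = channel.play('loop1');
const loop2 = channel.play('loop2');
// loop1 is now stopped; only loop2 remains on the channel
```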
### Playing a sound on a channel

There are two ways to play a sound on a channel. First of all, directly on the `channelsInstance`:

```js
channelsInstance.play('mysound', { channel: 'mychannel' });
```

Or, if you happen to have a reference to an actual channel:

```js
myChannel.play('my-sound');
```

All options for the `play()` method can still be used, except for the (in this context useless) `channel` prop:

```js
myChannel.play('sound', {
  volume: 0.5,
  loop: true,
  fadeInTime: 2,
  pan: 1,
  effects: {
    preVolume: myEffectsChain,
  },
});
```
### Stopping all sounds on a channel

```js
channelsInstance.stopAll({ channel: 'channel-name' });

// or, through a channel reference
myChannel.stopAll();

// non-immediate stops
channelsInstance.stopAll({ channel: 'channel-name', immediate: false });
myChannel.stopAll({ immediate: false });
```
### Default play/stop options

Channels can have default options that are used when calling `play()` or `stop()` for sounds playing on that channel. This can be used, for example, to create a channel on which every sound automatically loops when played.

These default options are the combination of the options for `play()` and `stop()`, without the channel:

```js
// options when playing a sound
const sound = channelsInstance.play('my-sound', {
  loop: true,
  fadeInTime: 1,
  volume: 0.5,
  channel: 'my-channel',
  pan: -1,
  effects: {
    preVolume: myEffectsChain,
  },
});

// options when stopping a sound
sound.stop({
  fadeOutTime: 1,
});

// the combination of both (minus the channel) can be used as channel defaults
const defaultStartStopProps = {
  loop: true,
  fadeInTime: 1,
  volume: 0.5,
  fadeOutTime: 1,
  pan: -1,
  effects: {
    preVolume: myEffectsChain,
  },
};

// either set on creation
channelsInstance.createChannel('my-channel', { defaultStartStopProps });

// or afterwards
myChannel.defaultStartStopProps = defaultStartStopProps;
```

Passing props to `play()` or `stop()` will override the `defaultStartStopProps` of a channel.
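The overriding presumably amounts to a shallow merge of the per-call props over the channel defaults; a conceptual sketch (not the library's actual implementation):

```javascript
// Channel defaults, with per-call props layered on top.
const defaultStartStopProps = { loop: true, fadeInTime: 1, volume: 0.5 };
const playProps = { volume: 0.8 };

// defaults first, call-site props override
const effectiveProps = { ...defaultStartStopProps, ...playProps };
// effectiveProps: { loop: true, fadeInTime: 1, volume: 0.8 }
```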
Default props (in combination with a `monophonic` channel) can be very helpful when creating a background music layer with music loops that need to change now and then:

```js
const channel = channelsInstance.createChannel('background-music', {
  type: 'monophonic',
  defaultStartStopProps: { fadeInTime: 2, fadeOutTime: 2, loop: true },
});

// starts loop1, looped, with a 2 second fade-in
channel.play('loop1');

// stops loop1 with a 2 second fade-out, and starts loop2 the same way
channel.play('loop2');
```
## Signal modifiers

A `SignalModifier` is something that allows the audio signal to be changed, for example to set the volume. These `SignalModifiers` exist in three places:

- on a sound
- on a channel
- on the main output
### Overall structure

Everything in `Channels` connects to the main output `SignalModifier`, which is the final step before going to the actual output. A channel has its own `SignalModifier`, which connects to the main `SignalModifier`. In the following image, the `SignalModifiers` are the blue blocks:

Sounds can be played either on a channel, or directly on the main output.
### Sound structure

Sounds themselves also have a `SignalModifier`:
### SignalModifier structure

A `SignalModifier` always contains the following nodes:

- a `GainNode` for setting volume
- a separate `GainNode` for applying fades
- a `StereoPannerNode` to pan the sound left or right

Optionally, custom effects chains can be added before or after these nodes.
### Modifying the signal

The three places that have a `SignalModifier` (sound, channel or main output) all implement the same set of methods:

```js
// on a channel
const myChannel = channelsInstance.getChannel('my-channel');
myChannel.setVolume(0.5);
myChannel.getVolume();
myChannel.mute();
myChannel.unmute();
myChannel.fadeOut(1);
myChannel.fadeIn(1);
myChannel.setPan(1);
myChannel.getPan();

// on a playing sound
const playingSound = channelsInstance.play('my-sound');
playingSound.setVolume(0.5);

// on the main output
channelsInstance.setVolume(0.5);
```

Volume values should be `0` or higher. Keep in mind that going beyond `1` might result in digital clipping.
When calling `mute()`, the volume will be set to `0`, with the additional effect that the previous volume value will be stored and used when calling `unmute()`.
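That behavior can be pictured with a small sketch (conceptual only, not the library's code): `mute()` remembers the current volume before setting it to `0`, and `unmute()` restores it.

```javascript
// Sketch of mute/unmute with volume restore.
class VolumeSketch {
  constructor() {
    this.volume = 1;
    this.volumeBeforeMute = null;
  }

  setVolume(value) {
    this.volume = value;
  }

  mute() {
    this.volumeBeforeMute = this.volume;
    this.volume = 0;
  }

  unmute() {
    if (this.volumeBeforeMute !== null) {
      this.volume = this.volumeBeforeMute;
      this.volumeBeforeMute = null;
    }
  }
}

const signal = new VolumeSketch();
signal.setVolume(0.5);
signal.mute(); // volume is now 0
signal.unmute(); // volume is restored to 0.5
```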
Pan values should be between `-1` (left) and `1` (right).
## Events

There are a few things in `Channels` that dispatch events. First of all: volume and pan changes on either a channel, a sound or the main instance.

```js
// volume change on the main output
channelsInstance.addEventListener('VOLUME_CHANGE', event => {
  console.log(event.data.volume);
});

// volume change on a channel
channel.addEventListener('VOLUME_CHANGE', event => {
  console.log(event.data.volume);
});

// volume change on a playing sound
playingSound.addEventListener('VOLUME_CHANGE', event => {
  console.log(event.data.volume);
});

// pan change on a channel
channel.addEventListener('PAN_CHANGE', event => {
  console.log(event.data.pan);
});
```

When using React, there are some hooks available to work with volume or pan changes.

Apart from that, the main instance notifies when the list of channels or playing sounds updates:

```js
channelsInstance.addEventListener('CHANNELS_CHANGE', () => {
  console.log(channelsInstance.getChannels());
});

channelsInstance.addEventListener('PLAYING_SOUNDS_CHANGE', () => {
  console.log(channelsInstance.getPlayingSounds());
});
```
## Audio effects

Effects can be placed in the chain of a `SignalModifier` and are therefore available in the three places that have a `SignalModifier`:

- on the main output
- on a channel
- on a playing sound

In all of these cases, effects can be inserted before and/or after the volume is applied.

Effects have to be defined as an `EffectsChain`, which is an object with an `input` and an `output` (both of type `AudioNode`). An effect can be a single node (with `input` and `output` pointing to the same `AudioNode`), or a long chain of multiple nodes.

Once an `EffectsChain` has been created, it can be used in the `preVolume` or `postVolume` prop of the `effects` option:
```js
// an EffectsChain consisting of a single node
const filter = audioContext.createBiquadFilter();
const myEffectsChain = {
  input: filter,
  output: filter,
};

// on the main output
const channelsInstance = new Channels({
  soundsExtension,
  soundsPath,
  effects: {
    preVolume: myEffectsChain,
  },
});

// on a channel
channelsInstance.createChannel('effect-channel', {
  effects: {
    postVolume: myEffectsChain,
  },
});

// on a playing sound
channelsInstance.play('my-sound', {
  effects: {
    preVolume: myEffectsChain,
    postVolume: myOtherEffectsChain,
  },
});
```

Both input and output use the in/out with index `0` to connect, which will work for nearly all cases. If for some reason you need to use a different index, you can add a `GainNode` before/after your effects as a solution.
## Use `<audio>` or `<video>` output

It is possible to route the audio from an `<audio>` or `<video>` element into `Channels`, for example to apply effects or to control their volume along with other sounds.

To do so, use the `connectMediaElement()` method on either a channel or the main instance:

```js
// connect to a channel
myChannel.connectMediaElement(myVideoElement);

// or connect to the main output
channelsInstance.connectMediaElement(myVideoElement);
```