# js-synthesizer

js-synthesizer is a library that generates audio data (frames). The WebAssembly (wasm) version of FluidSynth is used as the core synthesizer engine.
## Demo

https://www.pg-fl.jp/music/js-synthesizer/index.en.htm
## Install

```sh
npm install --save js-synthesizer
```
## Usage

### From main thread

Copy `dist/js-synthesizer.js` (or `dist/js-synthesizer.min.js`) and `externals/libfluidsynth-2.3.0.js` (the libfluidsynth JS file) into your project, and add `<script>` tags in the following order:

```html
<script src="libfluidsynth-2.3.0.js"></script>
<script src="js-synthesizer.js"></script>
```
If you want to use SoundFont version 3 (.sf3) files, use `externals/libfluidsynth-2.3.0-with-libsndfile.js` instead of `externals/libfluidsynth-2.3.0.js`.

Once the scripts are loaded, wait until `waitForReady` resolves before using the APIs:
```js
JSSynth.waitForReady().then(loadSynthesizer);

function loadSynthesizer() {
    // The synthesizer engine is ready; APIs can be used from here.
}
```
After initialization, you can use the APIs via the `JSSynth` namespace object.
```js
var context = new AudioContext();
var synth = new JSSynth.Synthesizer();
synth.init(context.sampleRate);

// Create an AudioNode that outputs rendered frames (frame count: 8192)
var node = synth.createAudioNode(context, 8192);
node.connect(context.destination);

synth.loadSFont(sfontBuffer).then(function () {
    return synth.addSMFDataToPlayer(smfBuffer);
}).then(function () {
    return synth.playPlayer();
}).then(function () {
    return synth.waitForPlayerStopped();
}).then(function () {
    return synth.waitForVoicesStopped();
}).then(function () {
    synth.close();
}, function (err) {
    console.log('Failed:', err);
    synth.close();
});
```
(The above example uses the Web Audio API, but you can use `Synthesizer` without Web Audio by using the `render()` method.)
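A Web-Audio-free setup might look like the following sketch. The `makeStereoBuffers` helper and the buffer size are illustrative assumptions, not part of js-synthesizer, and the exact `render()` signature should be verified against `dist/lib/ISynthesizer.d.ts`.

```javascript
// Hypothetical sketch of rendering without Web Audio. The helper below is
// not part of js-synthesizer; it only allocates per-channel buffers that
// render() is assumed to fill (verify against dist/lib/ISynthesizer.d.ts).
function makeStereoBuffers(frameCount) {
    // One Float32Array per channel (left, right)
    return [new Float32Array(frameCount), new Float32Array(frameCount)];
}

var buffers = makeStereoBuffers(8192);
// After synth.init()/loadSFont()/playPlayer() as in the example above:
// synth.render(buffers); // assumed to fill both channels with rendered frames
```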
js-synthesizer is built as a UMD module. If you prefer to load js-synthesizer as a CommonJS / ES module, you can use an `import` statement such as `import * as JSSynth from 'js-synthesizer'` with a bundler (such as webpack).
Notes:

- `js-synthesizer.js` targets ES2015-supported environments. If you need the script to run without errors on non-ES2015 environments such as IE11 (e.g. to report 'unsupported'), you should load the scripts dynamically or use a transpiler such as Babel.
- Just after the scripts are loaded, some APIs may fail because libfluidsynth is not ready yet. To avoid this, use the Promise object returned by `JSSynth.waitForReady`, as in the example above.
- The libfluidsynth JS file is not `import`-able, and its license (LGPL v2.1) differs from js-synthesizer's (BSD-3-Clause).
## With AudioWorklet

js-synthesizer supports the AudioWorklet process via `dist/js-synthesizer.worklet.js` (or `dist/js-synthesizer.worklet.min.js`). You can load js-synthesizer on the AudioWorklet with the following code:
```js
var context = new AudioContext();
context.audioWorklet.addModule('libfluidsynth-2.3.0.js')
    .then(function () {
        return context.audioWorklet.addModule('js-synthesizer.worklet.js');
    })
    .then(function () {
        var synth = new JSSynth.AudioWorkletNodeSynthesizer();
        synth.init(context.sampleRate);
        // createAudioNode must be called before the other instance methods
        var audioNode = synth.createAudioNode(context);
        audioNode.connect(context.destination);
        return synth.loadSFont(sfontBuffer).then(function () {
            return synth.addSMFDataToPlayer(smfBuffer);
        }).then(function () {
            return synth.playPlayer();
        }).then(function () {
            // ...
        });
    });
```
## With Web Worker

js-synthesizer and libfluidsynth can run on a Web Worker. Running them on a Web Worker keeps the main thread from being blocked while rendering.

To use js-synthesizer on a Web Worker, simply call `importScripts` as follows:

```js
self.importScripts('libfluidsynth-2.3.0.js');
self.importScripts('js-synthesizer.js');
```
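As a sketch, the worker side might render chunks and transfer them to the main thread like this. The function names and chunk size are illustrative assumptions, not part of js-synthesizer, and the `render()` signature should be verified against `dist/lib/ISynthesizer.d.ts`:

```javascript
// Illustrative worker-side helpers (hypothetical, not part of js-synthesizer).
// In the real worker file, the importScripts() calls shown above must run first.
var FRAME_COUNT = 8192;

function renderChunk(synth) {
    // Render the next FRAME_COUNT stereo frames into plain Float32Arrays.
    var left = new Float32Array(FRAME_COUNT);
    var right = new Float32Array(FRAME_COUNT);
    synth.render([left, right]); // assumed signature: render(Float32Array[])
    return [left, right];
}

function postChunk(chunk) {
    // Transfer the underlying ArrayBuffers to avoid copying the frame data.
    self.postMessage({ frames: chunk }, [chunk[0].buffer, chunk[1].buffer]);
}
```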
Note that since Web Audio is not supported on Web Workers, the APIs/methods related to Web Audio will not work there. If you want to use both a Web Worker and an AudioWorklet, you must implement an AudioWorkletProcessor manually, as follows:
- Main thread: create the AudioWorkletNode and establish connections between the Web Worker and the AudioWorklet.
    - You need to transfer rendered audio frames from the Web Worker to the AudioWorklet, but the AudioWorklet environment does not support creating a Web Worker. By creating a `MessageChannel` and sending its port instances to the Web Worker and the AudioWorklet, they can communicate with each other directly.
- Web Worker thread: render audio frames into raw buffers and send them to the AudioWorklet thread.
- AudioWorklet thread: receive (and queue) audio frames and 'render' them in the `process` method.
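The queue on the AudioWorklet side can be sketched in plain JavaScript. This `FrameQueue` is a hypothetical helper (not part of js-synthesizer) showing one way to buffer chunks received from the Web Worker and pull fixed-size blocks inside `process()`; only a single channel is shown for brevity:

```javascript
// Hypothetical frame queue for the AudioWorklet side (single channel).
function FrameQueue() {
    this.chunks = []; // queued Float32Arrays posted by the Web Worker
    this.offset = 0;  // read position inside chunks[0]
}

FrameQueue.prototype.push = function (chunk) {
    this.chunks.push(chunk);
};

FrameQueue.prototype.pull = function (out) {
    // Fill `out` from the queued chunks; on underrun, the tail of `out`
    // keeps its zero-initialized values (silence). Returns frames written.
    var written = 0;
    while (written < out.length && this.chunks.length > 0) {
        var head = this.chunks[0];
        var n = Math.min(out.length - written, head.length - this.offset);
        out.set(head.subarray(this.offset, this.offset + n), written);
        written += n;
        this.offset += n;
        if (this.offset === head.length) {
            this.chunks.shift();
            this.offset = 0;
        }
    }
    return written;
};
```

In a real `AudioWorkletProcessor`, `process()` would call `pull()` with the 128-frame output buffers it receives per render quantum.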
## API

### Creation of Synthesizer instance

These classes implement the interface named `JSSynth.ISynthesizer`.
- `JSSynth.Synthesizer` (construct: `new JSSynth.Synthesizer()`)
    - Creates a general synthesizer instance. No parameters are available.
- `JSSynth.AudioWorkletNodeSynthesizer` (construct: `new JSSynth.AudioWorkletNodeSynthesizer()`)
    - Creates a synthesizer instance that communicates with an AudioWorklet (see above). No parameters are available.
    - You must call the `createAudioNode` method first to use the other instance methods.
### Creation of Sequencer instance

The `Sequencer` instance is created only via the following methods:

- `JSSynth.Synthesizer.createSequencer` (static method)
    - Returns a Promise object that resolves with a `JSSynth.ISequencer` instance. The instance can be used with `JSSynth.Synthesizer` instances.
- `JSSynth.AudioWorkletNodeSynthesizer.prototype.createSequencer` (instance method)
    - Returns a Promise object that resolves with a `JSSynth.ISequencer` instance. The instance can be used with the `JSSynth.AudioWorkletNodeSynthesizer` instances that handled the `createSequencer` calls.
### Using hooks / handling MIDI-related event data with user-defined callbacks

You can hook MIDI events posted by the player. For a `JSSynth.Synthesizer` instance, use the `hookPlayerMIDIEvents` method as follows:
```js
syn.hookPlayerMIDIEvents(function (s, type, event) {
    if (type === 0xC0) { // 0xC0: program change
        if (event.getProgram() === 0) {
            // Use another SoundFont for program number 0
            syn.midiProgramSelect(event.getChannel(), secondSFont, 0, 0);
            return true; // the hook has handled this event
        }
    }
    return false; // process the event normally
});
```
For a `JSSynth.AudioWorkletNodeSynthesizer` instance, use `hookPlayerMIDIEventsByName` as follows:
```js
// In the worklet script:
AudioWorkletGlobalScope.myHookPlayerEvents = function (s, type, event, data) {
    if (type === 0xC0) {
        if (event.getProgram() === 0) {
            s.midiProgramSelect(event.getChannel(), data.secondSFont, 0, 0);
            return true;
        }
    }
    return false;
};

// On the main thread:
syn.hookPlayerMIDIEventsByName('myHookPlayerEvents', { secondSFont: secondSFont });
```
The sequencer also supports a 'user-defined client' to handle event data.

- For a sequencer instance created by `Synthesizer.createSequencer`, use the `Synthesizer.registerSequencerClient` static method.
    - You can use the `Synthesizer.sendEventNow` static method to send event data, processed by the synthesizer or clients, to other clients/synthesizers.
- For a sequencer instance created by `createSequencer` of `AudioWorkletNodeSynthesizer`, use the `registerSequencerClientByName` instance method.
    - The callback function must be added to `AudioWorkletGlobalScope`, like `hookPlayerMIDIEventsByName`'s callback.
    - To re-send event data, use `Synthesizer.sendEventNow` in the worklet. The `Synthesizer` constructor is available via `AudioWorkletGlobalScope.JSSynth.Synthesizer`.
- You can rewrite event data passed to the callback by using `JSSynth.rewriteEventData` (`AudioWorkletGlobalScope.JSSynth.rewriteEventData` for worklets).
### `JSSynth` methods

#### `waitForReady`

Can be used to wait until the synthesizer engine is ready.

- Return: a `Promise` object that resolves when the synthesizer engine (libfluidsynth) is ready

#### `disableLogging` / `restoreLogging`

Can be used to suppress (and restore) logs from libfluidsynth.
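For example, a small wrapper (hypothetical, not part of js-synthesizer) can keep logs suppressed only while a noisy operation runs:

```javascript
// Hypothetical convenience wrapper around the two functions above.
// `jssynth` is expected to be the JSSynth namespace object.
function withLoggingDisabled(jssynth, fn) {
    jssynth.disableLogging();
    try {
        return fn();
    } finally {
        jssynth.restoreLogging(); // always restore, even if fn() throws
    }
}

// Usage (in the browser):
// withLoggingDisabled(JSSynth, function () { /* load many SoundFonts */ });
```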
### `JSSynth.ISynthesizer` methods

(Not documented yet. Please see `dist/lib/ISynthesizer.d.ts`.)
## License

js-synthesizer is licensed under the BSD 3-Clause License, except for the files in the `externals` directory.

For the licenses of the files in the `externals` directory, please read `externals/README.md`.