smplr
smplr is a collection of sampled instruments for the Web Audio API, ready to be used with no setup required.
Examples:
```javascript
import { Soundfont } from "smplr";

const context = new AudioContext();
const marimba = new Soundfont(context, { instrument: "marimba" });
marimba.start({ note: 60, velocity: 80 });
```

```javascript
import { DrumMachine } from "smplr";

const context = new AudioContext();
const dm = new DrumMachine(context);
dm.start({ note: "kick" });
```

```javascript
import { SplendidGrandPiano, Reverb } from "smplr";

const context = new AudioContext();
const piano = new SplendidGrandPiano(context);
piano.output.addEffect("reverb", new Reverb(context), 0.2);
piano.start({ note: "C4" });
```
See demo: https://danigb.github.io/smplr/
smplr is still under development and its features are considered unstable until v1.0.
Read CHANGELOG for changes.
Library goals
- No setup: all samples are hosted online, so there is no need to run a server.
- Easy to use: everything should be intuitive for inexperienced developers.
- Decent sounding: uses high-quality open source samples. For better or worse, it is sample based 🤷
Setup
You can install the library with a package manager or use it directly by importing from the browser.
Samples are stored at https://github.com/smpldsnds and there is no need to download them. Kudos to all samplerists 🙌
Using a package manager
Use npm or your favourite package manager to install the library to use it in your project:
```bash
npm i smplr
```
Usage from the browser
You can import directly from the browser. For example:
```html
<html>
  <body>
    <button id="btn">play</button>
    <script type="module">
      import { SplendidGrandPiano } from "https://unpkg.com/smplr/dist/index.mjs"; // needs to be a url
      const context = new AudioContext(); // create the audio context
      const piano = new SplendidGrandPiano(context); // create and load the instrument
      document.getElementById("btn").onclick = () => {
        context.resume(); // enable audio context after a user interaction
        piano.start({ note: 60, velocity: 80 }); // play the note
      };
    </script>
  </body>
</html>
```
The package needs to be served as a URL from a service like unpkg or similar.
Documentation
Create and load an instrument
All instruments follow the same pattern: new Instrument(context, options). For example:
```javascript
import { SplendidGrandPiano, Soundfont } from "smplr";

const context = new AudioContext();
const piano = new SplendidGrandPiano(context, { decayTime: 0.5 });
const marimba = new Soundfont(context, { instrument: "marimba" });
```
Wait for audio loading
You can start playing notes as soon as the first sample is loaded. If you want to wait for all of them, use the load property, which returns a promise:
```javascript
piano.load.then(() => {
  // now the piano is fully loaded
});
```
Since the promise resolves to the instrument instance, you can create and wait in a single line:
```javascript
const piano = await new SplendidGrandPiano(context).load;
```
⚠️ In versions lower than 0.8.0 a loaded() function was exposed instead.
Load progress
Track how many samples have loaded via the onLoadProgress option or the loadProgress getter:
```javascript
const piano = new SplendidGrandPiano(context, {
  onLoadProgress: ({ loaded, total }) => {
    console.log(`${loaded} / ${total} samples loaded`);
  },
});

// Or poll at any time:
console.log(piano.loadProgress); // { loaded: 12, total: 48 }
```
total is known before loading starts, so you can display a determinate progress bar.
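For instance, a small helper (hypothetical, not part of smplr) can turn either the callback payload or the loadProgress getter into a percentage for a progress bar:

```javascript
// Hypothetical helper: convert { loaded, total } into a whole-number percent.
function progressPercent({ loaded, total }) {
  if (total === 0) return 0; // guard against division by zero before loading starts
  return Math.round((loaded / total) * 100);
}

console.log(progressPercent({ loaded: 12, total: 48 })); // 25
```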
Shared configuration options
All instruments share some configuration options that are passed as the second argument of the constructor. As the name implies, all fields are optional:
- volume: a number from 0 to 127 representing the instrument's global volume. 100 by default.
- destination: an AudioNode that is the output of the instrument. AudioContext.destination is used by default.
- volumeToGain: a function to convert the volume to gain. It uses the MIDI standard by default.
- disableScheduler: disables the internal scheduler. false by default.
- scheduleLookaheadMs: the lookahead of the scheduler. If the start time of a note is less than the current time plus this lookahead, the note is started. 200ms by default.
- scheduleIntervalMs: the interval of the scheduler. 50ms by default.
- onLoadProgress: a function called after each sample buffer is decoded. Receives { loaded, total }, where total is the full count known before loading starts.
- onStart: a function called when a note starts. It receives the started note as a parameter. Bear in mind that the time this function is called is not precise; it is determined by the lookahead.
- onEnded: a function called when a note ends. It receives the started note as a parameter.
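As an illustration of the volumeToGain option, here is a sketch of a custom curve you could pass. The quadratic taper below is an example choice for the demonstration, not necessarily smplr's built-in default:

```javascript
// Hypothetical custom volume curve: maps MIDI-style volume (0-127) to gain (0-1)
// with a quadratic taper, which tends to feel more natural than a linear ramp.
function volumeToGain(volume) {
  const v = Math.min(127, Math.max(0, volume)); // clamp to the MIDI range
  return (v * v) / (127 * 127); // 0 -> silence, 127 -> unity gain
}

// Usage sketch: new SplendidGrandPiano(context, { volumeToGain });
```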
Usage with standardized-audio-context
This package should be compatible with standardized-audio-context:
```javascript
import { AudioContext } from "standardized-audio-context";

const context = new AudioContext();
const piano = new SplendidGrandPiano(context);
```
However, if you are using TypeScript, you might need to "force cast" the types:
```typescript
import { Soundfont } from "smplr";
import { AudioContext as StandardizedAudioContext } from "standardized-audio-context";

const context = new StandardizedAudioContext() as unknown as AudioContext;
const marimba = new Soundfont(context, { instrument: "marimba" });
```
If you need to use the Reverb module (or any other module that needs AudioWorkletNode), you have to force it to use the one from the standardized-audio-context package. Here is how:
```typescript
import {
  AudioWorkletNode,
  IAudioContext,
  AudioContext as StandardizedAudioContext,
} from "standardized-audio-context";

window.AudioWorkletNode = AudioWorkletNode as any;
const context = new StandardizedAudioContext() as unknown as AudioContext;
// ... rest of the code
```
You can read more about this issue here
Play
Start and stop notes
The start function accepts a bunch of options:
```javascript
piano.start({ note: "C4", velocity: 80, time: 5, duration: 1 });
```
The velocity is a number between 0 and 127 that represents how hard the key is pressed: the bigger the number, the louder the sound. Velocity controls more than loudness; in some instruments it also affects the timbre.
The start function returns a stop function for the given note:
```javascript
const stopNote = piano.start({ note: 60 });
stopNote({ time: 10 });
```
Bear in mind that you may need to call context.resume() before playing a note.
Instruments have a global stop function that can be used to stop all notes:
```javascript
// This will stop all notes
piano.stop();
```
Or stop the specified one:
```javascript
// This will stop the C4 note
piano.stop(60);
```
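The stop-function pattern combines naturally with UI key handling. Below is a minimal sketch, with a hypothetical activeNotes map (not part of smplr), that stops exactly the note a key-down started:

```javascript
// Sketch: keep the stop function returned by start() so a key-up can stop
// exactly the note that its key-down started. `instrument` stands in for any
// smplr instrument, since start() returns a stop function for that note.
const activeNotes = new Map();

function keyDown(instrument, note) {
  activeNotes.set(note, instrument.start({ note }));
}

function keyUp(note) {
  const stop = activeNotes.get(note);
  if (stop) stop(); // stops only this note; instrument.stop() would stop them all
  activeNotes.delete(note);
}
```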
Schedule notes
You can schedule notes using the time and duration properties. Both are measured in seconds. time is the number of seconds since the AudioContext was created, as in audioContext.currentTime.
For example, the following plays a C major arpeggio, one note per second:
```javascript
const now = context.currentTime;
["C4", "E4", "G4", "C5"].forEach((note, i) => {
  piano.start({ note, time: now + i, duration: 0.5 });
});
```
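To schedule in musical time rather than raw seconds, a small helper (hypothetical, not smplr API) can convert beats at a given tempo into AudioContext time:

```javascript
// Hypothetical helper: convert a beat position at a given tempo into seconds
// on the AudioContext timeline. `now` would normally be context.currentTime.
function beatTime(now, bpm, beat) {
  return now + beat * (60 / bpm); // one beat lasts 60/bpm seconds
}

// Usage sketch, one chord tone per beat at 120 bpm:
// ["C4", "E4", "G4", "C5"].forEach((note, i) => {
//   piano.start({ note, time: beatTime(context.currentTime, 120, i) });
// });
```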
Looping
You can loop a note by using loop, loopStart and loopEnd:
```javascript
const sampler = new Sampler(audioContext, { duh: "duh-duh-ah.mp3" });

sampler.start({
  note: "duh",
  loop: true,
  loopStart: 1.0,
  loopEnd: 9.0,
});
```
If loop is true but loopStart or loopEnd are not specified, they default to 0 and the total duration, respectively.
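Those defaults can be expressed as a small sketch (hypothetical helper, not smplr API):

```javascript
// Hypothetical helper mirroring the defaults described above: missing loop
// points fall back to the start of the sample and its total duration.
function resolveLoopPoints({ loopStart, loopEnd }, totalDuration) {
  return {
    loopStart: loopStart ?? 0, // defaults to the beginning
    loopEnd: loopEnd ?? totalDuration, // defaults to the full duration
  };
}
```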
Change volume
The instrument's output attribute represents its main output. The output.setVolume method accepts a number where 0 means no volume and 127 is the maximum volume without amplification:
```javascript
piano.output.setVolume(80);
```
⚠️ volume is global to the instrument, but velocity is specific for each note.
Events
Two events are supported: onStart and onEnded. Both callbacks receive the started note as a parameter.
Events can be configured globally:
```javascript
const context = new AudioContext();
const sampler = new Sampler(context, {
  onStart: (note) => {
    console.log(note.time, context.currentTime);
  },
});
```
or on a per-note basis:
```javascript
piano.start({
  note: "C4",
  duration: 1,
  onEnded: () => {
    // will be called after 1 second
  },
});
```
Global callbacks will be invoked regardless of whether local events are defined.
⚠️ The invocation time of onStart is not exact. It triggers slightly before the actual start time and is influenced by the scheduleLookaheadMs parameter.
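The lookahead behavior can be pictured with a tiny sketch (illustration only, not smplr's internal scheduler code):

```javascript
// Illustration of the lookahead decision: a scheduled note fires as soon as
// its start time falls inside the lookahead window ahead of the current time.
function shouldStartNow(noteTime, currentTime, scheduleLookaheadMs = 200) {
  return noteTime < currentTime + scheduleLookaheadMs / 1000;
}
```

This is why onStart fires early: with the default 200ms lookahead, a note scheduled 100ms ahead is already inside the window.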
Effects
Reverb
A packed version of the DattorroReverbNode algorithmic reverb is included.
Use output.addEffect(name, effect, mix) to connect an effect using a send bus:
```javascript
import { Reverb, SplendidGrandPiano } from "smplr";

const reverb = new Reverb(context);
const piano = new SplendidGrandPiano(context);
piano.output.addEffect("reverb", reverb, 0.2);
```
To change the mix level, use output.sendEffect(name, mix):
```javascript
piano.output.sendEffect("reverb", 0.5);
```
Experimental features
Cache requests
If you use the default samples, they are stored on GitHub Pages. GitHub rate-limits the number of requests per second. That could be a problem, especially if you're using a development environment with hot reload (like most React frameworks).
If you want to cache samples o
