AudioContext, noteOn/noteOff, currentTime and time units

The Web Audio API is a high-level JavaScript API for processing sounds in web applications. The specification was released in 2012, and is supported in Chrome, in Safari on Mac OS X, and in iOS 6. The implementations I’ve tested are stable and responsive.

[Image: Beosystem 2000]

AudioContext is very new, and many examples on the net illustrate the basics of starting and stopping a sound. The methods that start and stop sounds are called noteOn and noteOff. To start or stop a sound immediately you can call

    sound.noteOn(0);
    sound.noteOff(0);
The argument of 0 means “immediately” but does not correspond to the audio engine’s notion of “now.” Indeed, I first made the mistake of thinking that the current audio time was relative to the execution time of the script: it is not – audio time is a property of the AudioContext.

To be completely explicit, I now write all mentions of sound triggers using the currentTime of a global audioContext.

    var now = audioContext.currentTime;
    sound.noteOn(now + 0);


    var now = audioContext.currentTime;
    sound.noteOff(now + 0.5);


An AudioContext is an object into which sounds can be rendered, much in the way graphics can be rendered into a Canvas. The audio API lets you create a graph of audio sources and effects. Audio sources can be waveforms, audio files, or primitive oscillators that produce pure tones. Oscillators can be started and stopped with the noteOn and noteOff methods. A simple example for playing a concert A (440 Hz) appears below.

    var aContext = new AudioContext();

    // Play an audio tone
    var osc = aContext.createOscillator();
    var volume = aContext.createGainNode();

    osc.type = 0; // sine wave
    osc.frequency.value = 440; // concert A

    volume.gain.value = 0.3;

    osc.connect(volume);
    volume.connect(aContext.destination);

    osc.noteOn(0);
    osc.noteOff(0.5);

In this example, a new audio context is created with two nodes: an oscillator and a gain element for adjusting the volume. The oscillator is turned on immediately and turned off a half-second later, just as you would expect. The reason it works correctly, however, is not obvious. The time units are in real-time, relative to the creation time of the AudioContext.

Time Units

Each AudioContext defines a one-dimensional coordinate space that is initialized to 0 when the context is created. It increases in real-time, in units of seconds. The value is volatile and corresponds to a hardware timestamp; it cannot be paused or stopped. Different AudioContexts define different coordinate spaces.
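Because each context has its own clock, a time that is meaningful in one AudioContext is meaningless in another. As an illustration, an event time can only be carried across coordinate spaces by comparing both clocks at the same moment. Here translateTime is a hypothetical helper of my own, not part of the API:

```javascript
// Hypothetical helper: map an absolute event time from context A's
// coordinate space into context B's, given simultaneous readings of
// both clocks (nowA = a.currentTime, nowB = b.currentTime).
function translateTime(eventTimeInA, nowA, nowB) {
    // The event lies (eventTimeInA - nowA) seconds in A's future;
    // apply the same offset to B's clock.
    return nowB + (eventTimeInA - nowA);
}
```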

The time arguments to noteOn and noteOff are interpreted as absolute times in the coordinate space of their AudioContext.
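One payoff of absolute times is that a whole sequence can be scheduled in advance from a single reading of currentTime. The sketch below uses noteStartTimes and playSequence, hypothetical helpers of my own written against the 2012-era noteOn/createGainNode API, to queue a short ascending arpeggio:

```javascript
// Compute absolute start times for evenly spaced notes, all measured
// from one reading of the context clock.
function noteStartTimes(now, count, spacing) {
    var times = [];
    for (var i = 0; i < count; i++) {
        times.push(now + i * spacing);
    }
    return times;
}

// Schedule every note up front; noteOn and noteOff receive absolute
// times in the coordinate space of the supplied context.
function playSequence(aContext, frequencies, spacing) {
    var times = noteStartTimes(aContext.currentTime, frequencies.length, spacing);
    for (var i = 0; i < frequencies.length; i++) {
        var osc = aContext.createOscillator();
        var volume = aContext.createGainNode();
        osc.type = 0; // sine wave
        osc.frequency.value = frequencies[i];
        volume.gain.value = 0.3;
        osc.connect(volume);
        volume.connect(aContext.destination);
        osc.noteOn(times[i]);
        osc.noteOff(times[i] + spacing * 0.8); // leave a small gap between notes
    }
}
```

For example, playSequence(aContext, [440, 550, 660], 0.25) queues three notes a quarter-second apart, regardless of when the browser actually runs the loop.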

Here is a good way to trigger an oscillator sound with precise duration in a button click handler.

    function makeTestAudioButton(aContext) {
        var btn = document.createElement('input');
        btn.value = "Beep";
        btn.type = 'button';
        btn.onclick = function () {
            var osc = aContext.createOscillator();
            var volume = aContext.createGainNode();

            osc.type = 0; // sine wave
            osc.frequency.value = 440;

            volume.gain.value = 0.3;

            osc.connect(volume);
            volume.connect(aContext.destination);

            var now = aContext.currentTime;
            osc.noteOn(now + 0);
            osc.noteOff(now + 0.5);
        };
        return btn;
    }

    function main() {
        var aContext = new AudioContext();
        document.body.appendChild(makeTestAudioButton(aContext));
    }


I now avoid using “0” as an argument to noteOn and noteOff. While it is a shorthand that means “now”, it has the side effect of not stating clearly which AudioContext’s timeline is in play. Instead, I reference currentTime explicitly and schedule events with deliberation. For me, specificity wins over brevity.

