APIs interact with your code using one or more JavaScript objects, which serve as containers for the data the API uses (held in object properties) and the functionality the API makes available (held in object methods).
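For example, here is a minimal illustration using the DOM API, which follows exactly this pattern (this snippet is just an illustration, not part of the audio example below):

console.log(document.title);                  // data held in an object property
const heading = document.querySelector('h1'); // functionality made available via an object method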

Note: If you are not already familiar with how objects work, you should go back and read through the JavaScript objects module before continuing.

Let’s take the Web Audio API as an example. This is a fairly complex API with a number of objects; here are the main ones:

  • AudioContext, which represents an audio graph that can be used to manipulate audio playing inside the browser, and which has a number of methods and properties available to manipulate that audio signal.
  • MediaElementAudioSourceNode, which represents an <audio> element containing the sound we want to play and manipulate inside the context.
  • AudioDestinationNode, which represents the destination of the audio, i.e. the physical component that will actually output the sound (usually the speakers or headphones).

So how do these objects interact? If you look at our audio example (you can also see it running live), you will see the following code:

<audio src="outfoxing.mp3"></audio>

<button class="paused">Play</button>
<br>
<input type="range" min="0" max="1" step="0.01" value="1" class="volume">

First, we include an <audio> element, with which we embed an MP3 in the page. We do not include any default browser controls. Next we include a <button> that we will use to play and stop the music, and an <input> element of type range, which we will use to adjust the volume of the track while it is playing.

Next, let’s look at the JavaScript for this example.

We start by creating an AudioContext instance, inside which we will manipulate our track:

// older browsers exposed the constructor under a webkit prefix; fall back to it if needed
const AudioContext = window.AudioContext || window.webkitAudioContext;
const audioCtx = new AudioContext();

Next, we create constants that store references to our <audio>, <button>, and <input> elements, and use the AudioContext.createMediaElementSource() method to create a MediaElementAudioSourceNode representing the source of our audio, i.e. the <audio> element the sound will be played from:

const audioElement = document.querySelector('audio');
const playBtn = document.querySelector('button');
const volumeSlider = document.querySelector('.volume');

const audioSource = audioCtx.createMediaElementSource(audioElement);

Next, we include a couple of event handlers that serve to toggle between play and pause when the button is pressed, and to reset the display back to the beginning when the song has finished playing:

playBtn.addEventListener('click', function() {
    // check whether the context is in a suspended state (autoplay policy)
    if (audioCtx.state === 'suspended') {
        audioCtx.resume();
    }

    // play or pause the track depending on its current state
    if (this.getAttribute('class') === 'paused') {
        audioElement.play();
        this.setAttribute('class', 'playing');
        this.textContent = 'Pause';
    } else if (this.getAttribute('class') === 'playing') {
        audioElement.pause();
        this.setAttribute('class', 'paused');
        this.textContent = 'Play';
    }
});

// when the track has finished playing, reset the button to its "paused" state
audioElement.addEventListener('ended', function() {
    playBtn.setAttribute('class', 'paused');
    playBtn.textContent = 'Play';
});

Note: Some of you may have noticed that the play() and pause() methods used to play and pause the track are not part of the Web Audio API; they are part of the HTMLMediaElement API, which is different but closely related.
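To illustrate the distinction, here is a small sketch (not part of the example above) of other HTMLMediaElement features you could use on the same <audio> element, entirely independent of the Web Audio graph:

audioElement.loop = true;            // restart the track automatically when it ends
audioElement.currentTime = 30;       // jump 30 seconds into the track
console.log(audioElement.duration);  // total length of the track, in seconds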

Next, we create a GainNode object using the AudioContext.createGain() method, which can be used to adjust the volume of audio fed through it, and we create another event handler that changes the value of the audio graph's gain (volume) whenever the slider value is changed:

const gainNode = audioCtx.createGain();

volumeSlider.addEventListener('input', function() {
    gainNode.gain.value = this.value;
});
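Setting gain.value directly is all this example needs, but gain is an AudioParam, so if the stepped changes from the slider ever sounded abrupt you could schedule a short ramp instead. Here is a minimal sketch (the 0.1-second ramp length is an arbitrary choice, not part of the original example):

volumeSlider.addEventListener('input', function() {
    // start from the current volume and glide to the new one over 0.1 s to avoid audible clicks
    gainNode.gain.setValueAtTime(gainNode.gain.value, audioCtx.currentTime);
    gainNode.gain.linearRampToValueAtTime(Number(this.value), audioCtx.currentTime + 0.1);
});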

The last thing to do to get this working is to connect the different nodes of the audio graph, which is done using the AudioNode.connect() method available on every node type:

audioSource.connect(gainNode).connect(audioCtx.destination);

The audio starts at the source, which is then connected to the gain node so the volume of the audio can be adjusted. The gain node is then connected to the destination node so the sound can be played on your computer (the AudioContext.destination property represents the default AudioDestinationNode available on your computer's hardware, for example your speakers).
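This chaining is what makes the audio graph easy to extend. As a sketch (not part of the example above), you could insert an extra node, say a StereoPannerNode, between the gain node and the destination just by adding one more connect() call:

// hypothetical extension: pan the sound between the left and right speakers
const panNode = audioCtx.createStereoPanner();
panNode.pan.value = -0.5; // -1 is full left, 1 is full right, 0 is centered

audioSource.connect(gainNode).connect(panNode).connect(audioCtx.destination);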
