Create HTML5 audio visualisations

Blurring the line between Flash and HTML, Nick Jonas gives you a jump-start into audio-driven animation that runs natively in the browser

This tutorial is designed to ease you into the powerful new Web Audio API (currently available in Chrome), a high-level JavaScript API for processing and synthesising audio in web applications. We will create an app that loops through a set of provided audio tracks and draws an audio visualisation to the HTML5 canvas element. The canvas animations will make use of Paper.js, a powerful graphics library that provides a clean and easy-to-use interface.

The goal here is to familiarise both beginner and advanced developers with the basics of the API, and to inspire you to explore the new possibilities of native audio and animation in the browser. For further reading, here are some resources: the complete Web Audio specification; Paper.js; DAT.GUI; and the Chromium Project.

01 First create a blank HTML file and, in the <head> tags, link to a blank JavaScript file (main.js in the support files) and the three provided JavaScript libraries: Paper.js (for animating to canvas), DAT.GUI (for controls) and Buffer Loader (for loading multiple sound files).
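A minimal page skeleton for this step might look like the following. The library file names and paths are assumptions (adjust them to match your own support files); the `resize` attribute on the canvas is Paper.js's way of keeping the canvas sized to the window.

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>Audio visualisation</title>
  <!-- Paths below are illustrative — point them at your support files -->
  <script src="js/libs/paper.js"></script>
  <script src="js/libs/dat.gui.js"></script>
  <script src="js/libs/buffer-loader.js"></script>
  <script src="js/main.js"></script>
</head>
<body>
  <canvas id="canvas" resize></canvas> <!-- Paper.js draws here -->
</body>
</html>
```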

02 To set up the audio, first create an instance of AudioContext with the call new webkitAudioContext(). This represents a set of AudioNode objects and their connections. Then, create an array of paths to the provided audio files that will be loaded, and an empty array that will store the loaded audio tracks (the music used here is created by dubstep artist Jordan Richardson).
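A sketch of this setup is below. The file names in `trackURLs` are placeholders for the provided audio files, and the `typeof` guards are there so the snippet degrades gracefully outside Chrome (at the time of writing, Chrome exposes the prefixed `webkitAudioContext`).

```javascript
// Create the audio context — the hub that all AudioNode objects connect through.
var context;
if (typeof AudioContext !== 'undefined') {
  context = new AudioContext();           // unprefixed, where available
} else if (typeof webkitAudioContext !== 'undefined') {
  context = new webkitAudioContext();     // Chrome's prefixed constructor
}

// Placeholder paths — swap in the provided tracks.
var trackURLs = ['audio/track1.mp3', 'audio/track2.mp3', 'audio/track3.mp3'];

// Will hold one decoded AudioBuffer per track once loading completes.
var tracks = [];
```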

03 To load individual sounds, an XMLHttpRequest object is used. To make things easy, we’ll make use of the generic BufferLoader class here (in your js/libs/ folder) to load multiple sound files and fire a callback once they have all loaded, so that we can start them at the same time. Please note, at this point we need to be running on a local or remote server (for example, MAMP for OS X users), because XMLHttpRequest cannot load files over the file:// protocol.
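The loading step can be sketched as follows. The `BufferLoader` constructor signature shown (context, URL list, callback) matches the common helper this tutorial ships with, but treat it as an assumption; `context` and `trackURLs` are the names from the previous step.

```javascript
// Filled with one decoded AudioBuffer per URL, in request order.
var tracks = [];

function finishedLoading(bufferList) {
  // Fired once every XMLHttpRequest has completed and decoded.
  tracks = bufferList;
}

if (typeof BufferLoader !== 'undefined') {
  // Assumed signature: new BufferLoader(context, urlList, onLoaded)
  var bufferLoader = new BufferLoader(context, trackURLs, finishedLoading);
  bufferLoader.load();
}
```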

04 For each track, we will create an AudioBufferSourceNode to play the decoded AudioBuffer, and an AudioGainNode for adjusting the gain/volume of that track. To play each sound from the start we will call noteOn(0) on each source node. Finally, we need to set an interval to trigger the ring animation every 215 milliseconds to match the beat of the tracks (140 BPM).
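The 215ms figure comes straight from the tempo: at 140 BPM one beat lasts 60000/140 ≈ 428.6ms, so a ring fires on every half-beat (≈ 214.3ms, rounded up here). A hedged sketch of the playback routine follows; `playAllTracks` and `createRings` are this tutorial's own names, not Web Audio built-ins, and `createGainNode`/`noteOn` are the prefixed method names of the era (today's equivalents are `createGain()` and `start(0)`).

```javascript
var BPM = 140;
var beatMs = 60000 / BPM;                    // ≈ 428.6ms per beat at 140 BPM
var ringIntervalMs = Math.round(beatMs / 2); // half-beat ≈ 214ms; the tutorial uses 215

function playAllTracks() {
  tracks.forEach(function (buffer) {
    var source = context.createBufferSource(); // one source node per track
    source.buffer = buffer;                    // attach the decoded AudioBuffer
    var gainNode = context.createGainNode();   // per-track volume control
    source.connect(gainNode);
    gainNode.connect(context.destination);     // route to the speakers
    source.noteOn(0);                          // start playback from the top
  });
  setInterval(createRings, 215);               // one ring per half-beat
}
```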

05 Our animation starts with a createRings() method called by our interval, which loops through each track. In this method, we’ll grab the track’s volume and apply a simple rule for creating a ring to animate: if the track’s volume is greater than zero, we’ll animate a ring out from the circle, basing both the frequency of the rings (at most one every 215ms) and the opacity of each ring on the volume level.
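One way to sketch that rule is below. `paper.Path.Circle` is a real Paper.js constructor, but `getTrackVolume` and the layout values (centre point, 40px radius, white stroke) are this example's own assumptions; the volume-to-opacity mapping is a plain clamp so louder tracks draw more opaque rings.

```javascript
// Map a track's volume to a ring opacity in [0, 1].
function volumeToOpacity(volume, maxVolume) {
  return Math.min(Math.max(volume / maxVolume, 0), 1);
}

function createRings() {
  tracks.forEach(function (track, i) {
    var volume = getTrackVolume(i);   // hypothetical helper reading the track's level
    if (volume <= 0) return;          // silent tracks draw nothing

    // Spawn a ring at the centre and let the animation loop scale it outward.
    var ring = new paper.Path.Circle(paper.view.center, 40);
    ring.strokeColor = 'white';
    ring.opacity = volumeToOpacity(volume, 1);
  });
}
```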