Making an Audio Scope with P5.js
This is a quick write-up to share with y’all a small project I’ve been working on using P5.js and Web Audio to implement some audio visualizations. By the end, we’ll have something like this:
Embedding an Audio File
HTML has the ability to embed audio in a page with the <audio> tag. This one declares a single MP3 file as a source.
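Something like the following, where the id and the file name (track.mp3) are placeholders I'll keep using in the snippets below:

```html
<!-- "track" and "track.mp3" are stand-ins; the id is just so the
     script can find the element later. -->
<audio id="track" src="track.mp3"></audio>
```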
In this form, the <audio> element doesn’t do anything except declare some audio that can be played. It’s invisible and the user can’t interact with it or control playback. That’s fine, because I’m going to implement my own playback control as part of my sketch below.
Processing Audio with Web Audio
Web Audio uses a node-based paradigm to process audio. Audio flows from source nodes, through a web of interconnected processing nodes, and out through destination nodes.
Sources can be <audio> tags or realtime waveform generators; processing nodes might be filters, gain adjustments, or more complex effects like reverb; and destinations could be your computer’s speakers or a file.
Here’s the entire code snippet that sets up the audio processing I need for the sketch:
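In rough form it looks something like this, continuing from the <audio> element above (the variable names are placeholders, and it assumes the script runs after the element exists on the page):

```js
// Grab the <audio> element declared earlier on the page.
const audioElement = document.querySelector('#track');

// The AudioContext encapsulates the entire node graph.
const audioContext = new AudioContext();

// Wrap the <audio> element in a source node...
const source = audioContext.createMediaElementSource(audioElement);

// ...and create an analyser node that exposes the raw samples
// without changing the audio that flows through it.
const analyser = audioContext.createAnalyser();

// Wire the graph together: source -> analyser -> speakers.
source.connect(analyser);
analyser.connect(audioContext.destination);
```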
The AudioContext is the object that encapsulates the entire node graph, so the first thing I do is create a new AudioContext.

Next, I create a MediaElementAudioSourceNode from the <audio> element I declared on this page.

Then comes an AnalyserNode. Analyser nodes don’t affect the audio that flows through them. Instead, this node gives the sketch access to the raw audio samples as they’re passing through the AudioContext. We’ll use this to plot the waveform as the audio is playing!

Finally, I hook up the nodes in the graph. We connect the output of the source node to the input of the analyser node, and the output of the analyser node to the audio context’s destination node, which routes to the computer’s speakers.
Our audio processing graph looks like this: source node → analyser node → destination node.
By itself the AudioContext doesn’t actually play any audio. I’ll tackle that next.
Playing Audio
Next up is starting playback. The following snippet creates a Play button using P5.js’s DOM manipulation API, and hooks up the button’s click event to start and stop playback.
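Here’s a sketch of that wiring, picking up the audioContext and audioElement variables from the setup above (the canvas size and names are placeholders):

```js
let playButton;

function setup() {
  createCanvas(600, 200);

  // Create the Play button with P5's DOM API and track playback
  // state on its dataset, since Web Audio won't tell us directly.
  playButton = createButton('Play');
  playButton.elt.dataset.playing = 'false';

  playButton.elt.addEventListener('click', () => {
    // Some browsers keep the AudioContext suspended until a user
    // gesture, so resume it here.
    audioContext.resume();

    if (playButton.elt.dataset.playing === 'false') {
      audioElement.play();
      playButton.elt.dataset.playing = 'true';
      playButton.html('Pause');
    } else {
      audioElement.pause();
      playButton.elt.dataset.playing = 'false';
      playButton.html('Play');
    }
  });
}
```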
Something I found odd while working with these audio components is that there isn’t a way to ask any of them whether audio is playing back at any given moment. Instead, it’s up to the script to listen for the appropriate events and track playback state itself.
If this snippet looks a little convoluted, that’s why.
To track playback status, I decided to set a playing property on the button’s dataset, indicating whether to call audioElement.play() or audioElement.pause(), and to set the label of the button appropriately.
The last bit of playback state tracking to do is to listen for when playback ends because it reached the end of the audio file. I did that with the ended event:
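A minimal handler, using the same placeholder names as above:

```js
// When the track plays all the way through, reset the button so the
// next click starts playback again from the top.
audioElement.addEventListener('ended', () => {
  playButton.elt.dataset.playing = 'false';
  playButton.html('Play');
});
```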
This handler resets the playing flag and the label of the button.
The Sketch
Now it’s time to draw some waveforms! The main part of a P5 sketch is the draw method. Here’s mine:
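Something along these lines, reusing the analyser from the setup above (the particular colors and point sizes here are arbitrary choices for illustration):

```js
function draw() {
  background(20);

  // One sample per horizontal pixel of the sketch.
  const samples = new Float32Array(width);
  analyser.getFloatTimeDomainData(samples);

  // Endpoint colors for the interpolation; quiet samples draw small
  // and cool, loud samples draw big and warm.
  const quiet = color(80, 160, 255);
  const loud = color(255, 80, 160);

  for (let x = 0; x < samples.length; x++) {
    const sample = samples[x];   // time-domain value in the range -1..1
    const level = abs(sample);   // 0 = silence, 1 = full scale

    // Scale the sample to the height of the sketch.
    const y = map(sample, -1, 1, height, 0);

    strokeWeight(lerp(1, 6, level));
    stroke(lerpColor(quiet, loud, level));
    point(x, y);
  }
}
```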
The most interesting part of this function is where we get an array of samples from the analyser node. The samples variable is a JavaScript Float32Array, with one element for each pixel of width.
Once the sample data is populated from the analyser, we can render the samples by plotting them along the X axis, scaling each value to the height of the sketch.
I also manipulate the weight (size) of the point and its color by interpolating sizes and colors based on the value of the sample.