Live Music Visualiser: Groove Salad

This project is built in p5.js, and you can view the desktop beta version right here.

 

I’m still ironing out some glitches in the mobile version, but you can see the latest mobile version here.

 

Other than a handful of birds of prey, we have the best eyesight in the animal kingdom, and it’s sight that much of our evolution has been organised around. Even if we’re no longer spotting prey on the savannah with sight and spear, we’re picking out a box of Coco Pops from a box of Weetabix in the supermarket (vital). Sight plays a big part in our life experience.

 

But what about sound? Does it take a back seat to sight, or is it just as important in sculpting our world? Or is it precisely because eyesight is our dominant sense that sound supplies a largely unconscious understanding of the world, acting intuitively and instantly: vital but unnoticed?

 

What if we illuminated sound and gave it texture or motion, colour or form? Would the priority change? Would we ‘see’ things differently? For the visually impaired, this is already a reality.

 

The purpose of this project is to take a deep look at the role sound plays in ordinary working lives by bringing it into sharp focus.

 

The current code is available in this GitHub repo; suggestions are welcome.

 

Here’s the journey thus far…

I started with this cassette, which was originally a gift from my grandfather.

 

The idea was originally brought about by a story analogous to the gift. After a life spent in London jazz clubs – singing in some of them – my grandfather was a true fan of the craft. As old age and slowly diminishing hearing set in, he set out to work systematically through his entire record collection, listening to the best of the best before his hearing went entirely and the world of music was lost to him.

 

It’s this story that the cassette tape has become emblematic of, and this story that ended up inspiring the project: how music informs our experience of the world, and how music tells us who we are.

 

I set about making this idea literal: having music and sound feed back into a real visual manifestation of the world, blended with the music itself, to manipulate our understanding of what the world “is”.

The project evolved in many directions before settling on the final outcome.

 

It began with coding a camera pixel generator, converting the live video feed into a low-resolution grid of rectangles, and then coding sound input to manipulate the height of those pixels: first with simple sound-level input, and then with EQ (frequency-band) response for a more expressive visualisation of sound.
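The downsampling step described above can be sketched in plain JavaScript, with no p5 required. The function name and the `videoW`/`videoH`/step parameters here are hypothetical stand-ins for the capture dimensions and slider-driven strides used in the sketch:

```javascript
// Sketch of the video-to-grid mapping: walk the video frame in steps,
// and for each sampled pixel record its index into the flat RGBA pixel
// array plus its scaled-up position on the canvas.
function gridCells(videoW, videoH, canvasW, canvasH, stepX, stepY) {
  const cells = [];
  for (let cy = 0; cy < videoH; cy += stepY) {
    for (let cx = 0; cx < videoW; cx += stepX) {
      cells.push({
        // 4 values (R, G, B, A) per pixel in the flat array
        offset: (cy * videoW + cx) * 4,
        // scale video coordinates up to canvas coordinates
        x: (cx / videoW) * canvasW,
        y: (cy / videoH) * canvasH,
      });
    }
  }
  return cells;
}
```

Each cell then becomes one rectangle, whose height is driven by the audio analysis rather than by the video alone.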

Here’s the refined code for the main elements you see and interact with. I’m not showing setup() and the other functions, which would require ungodly amounts of screen space I’m not willing to sacrifice for a simple project page 😉

function draw() {

  // NOISE FILTER //
  // Map vertical mouse position to the low-pass filter's cutoff frequency,
  // then clamp the result to the audible range used by the sketch.
  let freq = map(mouseY, 0, 380, 380, 15000);
  freq = constrain(freq, 380, 22000);
  filter.freq(freq);
  filter.res(50);

  // BACKGROUND //
  // Analyse the spectrum, then read the energy of each frequency band.
  fft.analyze();
  Lows = int(fft.getEnergy("bass"));
  LowMids = int(fft.getEnergy("lowMid"));
  Mids = int(fft.getEnergy("mid"));
  HighMids = int(fft.getEnergy("highMid"));
  Highs = int(fft.getEnergy("treble"));

  background((Lows + LowMids) / 2, HighMids + Highs, Mids + slider3.value());

  // VIDEO AND EQ PIXEL LOAD //
  translate(slider5.value() * 50, slider5.value() * 50);

  capture.loadPixels();
  spectrum = fft.analyze();
  noStroke();
  fill((HighMids + Highs) * 1.5,
       slider2.value() + Mids,
       (Lows + LowMids) / 2,
       70 * (slider1.value() + 1) / 1.5);

  for (let cy = 0; cy < capture.height; cy += 13) {
    for (let cx = 0; cx < capture.width; cx += 1 + slider1.value()) {
      let offset = ((cy * capture.width) + cx) * 4; // RGBA: 4 values per pixel
      let xpos = (cx / capture.width) * width;
      let ypos = (cy / capture.height) * height;

      // Height scales with the spectrum bin and the pixel's green channel (0-255).
      rect(xpos, ypos, 6 - slider1.value(),
           spectrum[cy] * (capture.pixels[offset + 1] / 255));

      // TRANSFORM //
      // Each rectangle rotates the coordinate system a little further,
      // so the grid spirals as slider5 increases.
      rotate(PI / 180 * slider5.value());
    }
  }
}
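The mouse-to-cutoff mapping at the top of draw() reduces to ordinary arithmetic. As a sanity check, here it is as a standalone function, reimplementing the behaviour of p5’s map() (linear interpolation, extrapolating outside the input range) and constrain() in plain JavaScript:

```javascript
// Plain-JS version of the cutoff mapping: mouseY in [0, 380] maps
// linearly to [380, 15000] Hz; values beyond 380 extrapolate above
// 15000 Hz and are then clamped to [380, 22000] Hz, as in the sketch.
function cutoffFrequency(mouseY) {
  const mapped = 380 + (mouseY / 380) * (15000 - 380);
  return Math.min(22000, Math.max(380, mapped));
}
```

So moving the mouse down the canvas opens the filter smoothly, and dragging past the mapped range pushes the cutoff towards the clamp ceiling.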