Mark Wheeler created a series of visual sound experiments: openFrameworks apps that you can ‘play’, generating animations from live MIDI data.
Experiments 1 & 2 were created by Mark using live MIDI notes, BPM and CC values, giving the visuals a ‘tightness’ to the music that wouldn’t be possible with audio data alone. Each musical note is tied to a visual, audio effects have visual counterparts, and transitions happen in time with what’s being played.
The third experiment came about after Russ Chimes suggested a collaboration based around his track We Need Nothing to Collide. Clay Weishaar also came on board and helped take the visuals out into the real world. The setup the team used included a 5000-lumen projector powered from a car via an inverter, with the footage shot on a 5D.
“At first we planned on shooting at more wild, natural locations. However, after doing a test shoot in suburbia we realised there was something quite magical about the projections transforming these more mundane settings. Of course, it’s also fun watching the reactions from passersby (or, sometimes, their ability to ignore huge projections).” – Mark Wheeler
Everything is driven by custom openFrameworks apps linked to an Ableton Live set and MIDI controllers. A Monome running Mark Eats Sequencer is also used in some of the experiments you see below.