Project Date: Spring 2014

Technologies: Processing, MIDI, Arduino, Ableton Live

My final semester at RIT was consumed by New Media Team Project. I served as the development lead on a team of four developers and three designers tasked with making something… awesome. Our group quickly meshed on a general idea: create an interactive experience with generative audio and retro visuals, powered by your brain! After 15 long weeks the result was EEGJ, or “Electroencephalogram Jockey,” which was a hit at Imagine RIT 2014.

EEGJ from Whitney Brown on Vimeo.

The exhibit used a hacked MindFlex toy as the primary input, and the display was driven by three projectors. Users sat in front of the display and donned the MindFlex, which provided a rough reading of their brain waves. This brain data was used to control a Guitar Hero-esque game and to determine the tone, tempo, and instrumentation of a generative music track. Pressure sensors built into the chair controlled audio filters and visual “turntables”. The game occupied the center display, while the side displays showed other visuals that beat, pulsed, and moved in time with the tempo of the music.
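To give a sense of how a brain reading can drive musical parameters, here is a minimal sketch of the kind of mapping involved. The class and constant names are illustrative assumptions, not EEGJ's actual code; it simply clamps a noisy MindFlex-style "attention" value (0&ndash;100) and maps it linearly onto a tempo range.

```java
// Hypothetical mapping from a raw MindFlex-style "attention" reading (0-100)
// to a tempo for the generative track. Names and ranges are illustrative only.
public class BrainMapper {
    static final int MIN_BPM = 80;  // tempo when the user is calm
    static final int MAX_BPM = 160; // tempo when the user is highly focused

    // Clamp the noisy sensor value, then map it linearly to beats per minute.
    static int attentionToBpm(int attention) {
        attention = Math.max(0, Math.min(100, attention));
        return MIN_BPM + (MAX_BPM - MIN_BPM) * attention / 100;
    }

    public static void main(String[] args) {
        System.out.println(attentionToBpm(0));   // calm -> 80 BPM
        System.out.println(attentionToBpm(100)); // focused -> 160 BPM
    }
}
```

The same pattern (clamp, then map onto a musical range) works for tone and instrumentation choices as well.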

EEGJ - Imagine R.I.T. from Antwan Russell on Vimeo.

My main contribution to the project was the “Riri Framework”. Named for our project’s early title (“Riri,” a nonsense word), the Riri Framework was a Processing library for creating pre-programmed or generative sequences of MIDI data. The framework consisted of RiriNotes (individual MIDI notes) and RiriChords (groups of notes to be played simultaneously) that could be strung together into RiriSequences and played indefinitely. Using another Processing library called The MidiBus, the MIDI data generated with the Riri Framework could be sent to any other program (in our case Ableton Live) to be turned into sound using soundbanks or software instruments.
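The note/chord/sequence structure described above can be sketched roughly as follows. The class names come from the post, but the fields and methods shown here are assumptions for illustration, not the actual library's API; a real player would walk the sequence on a timer and emit each chord over MIDI (e.g. via The MidiBus).

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative model of the Riri Framework's structure. Class names are from
// the post; fields and methods are hypothetical, not the real library API.
class RiriNote {
    int pitch, velocity, duration; // MIDI pitch 0-127, velocity 0-127, duration in ms
    RiriNote(int pitch, int velocity, int duration) {
        this.pitch = pitch;
        this.velocity = velocity;
        this.duration = duration;
    }
}

// A chord groups notes to be played simultaneously.
class RiriChord {
    List<RiriNote> notes = new ArrayList<>();
    RiriChord add(RiriNote n) { notes.add(n); return this; }
}

// A sequence strings chords together; a player loop can repeat it indefinitely,
// regenerating or swapping chords between passes for a generative feel.
class RiriSequence {
    List<RiriChord> chords = new ArrayList<>();
    void append(RiriChord c) { chords.add(c); }
    int length() { return chords.size(); }
}
```

For example, a C-major chord would be a RiriChord holding three RiriNotes (pitches 60, 64, and 67) appended to a RiriSequence for playback.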

In addition to the Riri Framework, I was responsible for organizing and distributing work among the other developers. I also ran early experiments with MIDI for the sound and Arduino for the various sensors, and assembled all of the individual components into the final Processing sketch used at the exhibit.

View the project on GitHub!