I performed a 5-minute audio-visual piece at CultureHub for an audience of about 50 people. Here is a recording of it – the piece is called Big Dark Age.
This piece was set up and executed live using Max/MSP Jitter – a visual programming language for music and multimedia.
As you can see, the venue's three projectors provided three distinct planes that could be blended together or used separately.
Using Max, I controlled what was shown through three main elements of the piece:
- The color planes that tint all videos
- The content being displayed in the middle
- The playback speed of the two side videos
The piece was made in the Live Image Processing & Performance class, a big thank you to Prof. Matt Romein and all my classmates for a great course.
A large part of this piece and my work in general is COLOR. In this piece I swapped the RGB(A) color planes of the side videos and the main video. In other words, I turned red into green, green into blue, and so on.
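The plane swap itself isn't specific to Max – here is a minimal Python sketch (hypothetical, not taken from the actual patch) of the same permutation applied to a single pixel:

```python
def swap_planes(pixel):
    """Rotate an (r, g, b) pixel one plane over: red -> green, green -> blue, blue -> red."""
    r, g, b = pixel
    # new red = old blue, new green = old red, new blue = old green
    return (b, r, g)

# Pure red comes out as pure green:
swap_planes((255, 0, 0))  # -> (0, 255, 0)
```

In Jitter, the patch applies this kind of permutation to whole video matrices at once; the snippet is just the per-pixel idea.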
The colors were controlled with just a few clicks, built into the center of this home-made Max patch controller, which I created and used during the performance:
As you can see, there are many other buttons to control various features of the playback, including things like “Color Uzi,” which fires off a splatter of plane re-mappings at once, like this:
…but why a crazy robot warrior?
Main Screen Content
The content played on the main screen was from a selection of 12 videos that I ripped, remixed, and/or created. The content was targeted towards an interpretation of the aesthetic and lyrics of the song – Little Dark Age by MGMT.
The videos were designed to resonate with the themes of darkness, escapism, and degradation in the music. They were picked to make the audience consider our efforts and compromises – and how we justify our intentions over our actions.
In less-lofty words: videos of escape attempts, surveillance, and military action with a heavy memento mori.
The lyrics and sounds of the song directly inspired me to make the piece. I was happy to hear this was true to MGMT’s intent, as described in this behind-the-scenes:
While the main message was carried through the center video, the side videos were looped at various speeds to capture the energy of the song in another way.
Video Playback Speed
Although it's not apparent from the video, I used an external device to control the playback speed of the side videos.
For example, here is a side-video playing at 1x speed on the top, and 2x speed on the bottom:
The device I used to control this speed was this boat lever!
Pulling the lever to ‘full-stop’ like it is pictured above reduced the video playback speed to 0. Meanwhile, bringing it fully forward doubled the playback speed. Here is a small clip of me ripping it forward.
The lever has a small potentiometer inside of it, which is connected to an Arduino, which was then connected to the computer and Max patch.
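The lever-to-speed mapping can be sketched like this – a hypothetical helper, assuming the potentiometer is read as a 10-bit value (0–1023, the usual Arduino `analogRead` range) and scaled linearly from full-stop (0x) to fully forward (2x):

```python
def lever_to_speed(raw, raw_max=1023, speed_max=2.0):
    """Map a 10-bit potentiometer reading to a video playback rate.

    0 (full-stop)      -> 0.0x (frozen)
    raw_max (forward)  -> 2.0x (double speed)
    """
    raw = max(0, min(raw, raw_max))  # clamp noisy or out-of-range readings
    return raw / raw_max * speed_max

lever_to_speed(0)     # -> 0.0
lever_to_speed(1023)  # -> 2.0
```

The real chain runs through the Arduino and into the Max patch over serial; the exact scaling there may differ, but the idea is this one linear map.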
Here is a video of this big crank lighting up a small LED, from when I was testing it:
I created the housing and setup for the performance in a few hours, using a few random screws, some wood scrap, and a pocket drill jig. Shout out to Ben Light’s fabrication class for the knowledge. It held up great.
Beyond the nice presentation setup, the Max patch is a behemoth, but somewhat organized:
You are welcome to download it, along with the simple Arduino code used in conjunction. In the end, controlling these three features was really all there was to the 3-screen presentation. The performance was a fantastic experience. Thanks again to everyone who was involved with the show and to my lovely audience.