For my final project, I wanted to find a way to combine many of the field recordings I had made, along with other, much longer sound recordings and experiments from over the semester, into one final piece. These recordings consisted of sounds in my studio (an hour-long recording of the radiator banging), moments from A Brief History of Time (mostly recordings of Stephen Hawking), recordings from the Bayernhof Museum, and live and altered recordings from the robot project I was a part of (recorded afterwards and independent of the scope of the initial project). I had been playing with these sounds for several weeks in Ableton Live and had become interested in the mood and message created when I combined them.
I had initially planned to build another robot to play along with the recordings (triggered by motion or movement) and held within several sculptural containers. But once I started adding the recordings into Ableton Live and mapping them to the Alias 8 controller, I realized I wanted more control over the sounds. After going over some of the past posts Jesse made related to the Kinect, I recalled that when I first came here, one of the initial projects I had wanted to create was a body-controlled sound piece. I had done some explorations with the Kinect before, but hadn't used it with Max, and I was excited about the possibilities of driving Ableton Live through Synapse. Unfortunately, Synapse can only be run with the first-generation Kinect (which I thought I had), so the computer in the sound lab was showing missing areas in Synapse when I tried to run it. Another program offered a way to use Synapse and seemed fairly easy to get up and running, but unfortunately I had issues with that one as well.
Shifting gears, I instead used the dp.kinect external that I had already installed (and reworked the patch). I then added some of the new Vizzie effects in Max 7, along with a subpatch that lets changes in amplitude alter both the delay and the zoom of the visuals. After much trial and error, I set the parameters of the Kinect so that the delay created a jump in time and place of the captured body, corresponding with the mood and message of the samples and effects I had built in Ableton Live.
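The amplitude-to-parameter mapping described above was built as a Max subpatch, but the underlying idea can be sketched in a few lines of Python. This is only an illustrative sketch, not the actual patch: the parameter names and ranges (40–1200 ms of delay, 1.0–2.5× zoom) are hypothetical stand-ins for whatever the Vizzie modules expose.

```python
def scale(value, in_min, in_max, out_min, out_max):
    """Linearly map value from [in_min, in_max] to [out_min, out_max], clamped."""
    value = max(in_min, min(in_max, value))
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

def amplitude_to_params(amplitude):
    """Map a normalized 0.0-1.0 amplitude reading to a (delay_ms, zoom) pair.

    Ranges are hypothetical: louder audio yields a longer visual delay
    (a bigger jump in time) and a tighter zoom on the captured body.
    """
    delay_ms = scale(amplitude, 0.0, 1.0, 40.0, 1200.0)
    zoom = scale(amplitude, 0.0, 1.0, 1.0, 2.5)
    return delay_ms, zoom
```

In the patch itself this mapping runs continuously on the incoming audio level, so quiet passages keep the video close to real time while loud hits push the delayed, zoomed body further out of sync.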
Version of the final performance/installation:
Hopefully I can replace it with a better version tomorrow.