Category Archives: Projects

Coral

Developed by Luke Hottinger and Chris Williams

Coral is a wooden, electronic audio game for the visually impaired. Housed within a laser-engraved wooden enclosure, the toy's electronics produce tonal sequences for the player to match; the player responds by rotating the toy in one of four directions. The highly textured exterior and the stylized marker provide tactile orientation cues, and game states are distinguished by families of distinctive auditory cues. This project explored diverse disciplines: sound design, product design, interaction design, user interface design, and architecture.

Full documentation is also available here.

coral_box_front

Roles:

Luke Hottinger

  • Programming
  • Research
  • Documentation
  • Assembly/Electronics

Chris Williams

  • Game Mechanic Design
  • Sound/Visual Design
  • Research/Documentation
  • Assembly/Fabrication

Implementation:
Product Design:
The initial motivation for Coral was the exploration of alternative perspectives, as well as collaboration.

The primary and secondary interaction modes of Coral are auditory and tactile, respectively. The project emerged from a desire to work with the visually impaired; within that community, sound and touch are central to daily interaction with the world.

The form of Coral was simplified to a cube. A cube has discrete surfaces, which allows for easy mapping of tones to faces. Historically, the wooden block is one of the oldest and simplest toys for young children.

Coral was designed for use by people aged seven and older. As a result, the design was kept lightweight.

The enclosure is 64 cu. in. (4 in. × 4 in. × 4 in.). The electronics were optimized to fit within 27 cu. in. (3 in. × 3 in. × 3 in.).

coral_box_front

coral_box_back

Sensors:

Coral used a number of inputs and sensors to help the user navigate the menu as well as play the game. The project used three main components: an Arduino Uno, an InvenSense MPU-6050 accelerometer, and a Nordic nRF8001 Bluetooth Low Energy (BLE) module. The accelerometer transferred data to the Arduino over the I2C protocol, while the BLE module communicated with the Arduino over SPI. Other components include a piezo speaker for playing tones and a 1000 mAh LiPo battery.

Accelerometer:
The MPU-6050 is a 6-axis gyroscope/accelerometer that formed the basis of our gesture sensing. The sensor reports both the current direction of gravity and the rotational rate and orientation of the box. Gestures are recognized through a state-based mechanism: each side of the cube is assigned a state number that indicates whether it is the active side (facing upward). When the cube detects a state change, it begins recording the string of states that follows. In one-player mode, this string of state changes is then compared to the string of states randomly chosen by the program, to see whether the player entered the correct pattern.
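Coral's own firmware is linked at the end of this post; purely as an illustration of the face-up "state" idea, a minimal Arduino sketch like the one below reads the raw accelerometer values over I2C and maps the dominant gravity axis to a state number (the register addresses are standard MPU-6050 values; the state numbering and game hookup are hypothetical):

```cpp
// Illustrative sketch only (not Coral's actual firmware): detect which face of the
// cube is up from an MPU-6050 at its default I2C address 0x68.
#include <Wire.h>

const int MPU_ADDR = 0x68;

void setup() {
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);              // PWR_MGMT_1: clear the sleep bit so the sensor starts sampling
  Wire.write(0);
  Wire.endTransmission();
}

int16_t read16() {               // read one high/low register pair
  int16_t hi = Wire.read();
  int16_t lo = Wire.read();
  return (hi << 8) | lo;
}

// Return a state number 0-5 for whichever face is up, based on which axis carries gravity.
int faceUpState() {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);              // ACCEL_XOUT_H: start of the six accelerometer bytes
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 6);
  int16_t ax = read16(), ay = read16(), az = read16();

  // The dominant axis carries gravity; its sign says which of the two opposite faces is up.
  if (abs(ax) >= abs(ay) && abs(ax) >= abs(az)) return ax > 0 ? 0 : 1;
  if (abs(ay) >= abs(az)) return ay > 0 ? 2 : 3;
  return az > 0 ? 4 : 5;
}

void loop() {
  static int lastState = -1;
  int state = faceUpState();
  if (state != lastState) {
    lastState = state;           // a state change: the game loop would append this to the player's pattern
  }
  delay(50);
}
```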

BLE Module:

The nRF8001 is a BLE module that presents a UART-style serial link, allowing serialized data to be exchanged between the chip and another BLE-enabled device (the chip itself talks to the Arduino over SPI, as noted above). In our project it was intended to transmit gameplay data between an iOS device and the cube for a two-player mode; in practice it served as a means of debugging and game servicing, letting us send commands from an iOS app to the cube to test various functions in the gameplay program.
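As a sketch of that debug path, assuming Adafruit's nRF8001 breakout and its Adafruit_BLE_UART library (the pin choices and the single-letter commands are hypothetical), the Arduino side could look roughly like this:

```cpp
// Hypothetical debug-command handler over the nRF8001's BLE UART service.
#include <SPI.h>
#include <Adafruit_BLE_UART.h>

// REQ, RDY, RST pins (RDY must be an interrupt-capable pin on the Uno).
Adafruit_BLE_UART ble = Adafruit_BLE_UART(10, 2, 9);

void setup() {
  ble.begin();
}

void loop() {
  ble.pollACI();                    // service the BLE connection
  while (ble.available()) {
    char cmd = (char)ble.read();    // one-byte commands sent from the iOS app
    if (cmd == 'p') {
      // e.g. replay the current tone sequence (placeholder)
    } else if (cmd == 'r') {
      // e.g. reset the game state (placeholder)
    }
  }
}
```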

Arduino:

The Arduino Uno microcontroller acted as the central processing unit for the project, running the gameplay program and managing all of the connections and gestures entered. It handled multiple protocols, such as I2C and SPI, to connect to the various sensors and components. The Arduino also served as the project's power regulator, taking in 12 V and stepping it down to component-friendly 3.3 V and 5 V.

LiPo Battery:

The battery was a 1000 mAh, 3-cell (~12 V) HobbyKing LiPo. It had plenty of capacity to keep the project running for upwards of 15 hours of continuous use.

Piezo Speaker:

The piezo we used had a frequency response range of 30 Hz to 15,000 Hz.

Wiring Diagram:

coral_wiring_diagram

Material:

The enclosure is made of laser-etched and laser-cut poplar wood. Wood was chosen for its organic properties, such as its general texture and relative malleability.

Poplar was chosen due to its softness and straight, uniform grain. The uniform grain allowed for greater resolution of texture into the surface.

poplar

Interface:

Four sides of the enclosure are used during gameplay. The remaining two, the top and bottom surfaces, are used for utility. The top surface has a shell design that serves as a tactile “home” cue and houses the power switch. The bottom surface has a concentric pattern that points toward a proposed, centrally located USB power jack.

So that the enclosure's walls can be readily identified, each exterior surface was given a unique texture. The textures were modeled after coral, which exhibits widely varied textures across species; this provided a unifying visual and tactile motif. Similarly, the shell atop the “Home” square is modeled after a gastropod called a limpet.

Gameplay encourages grasping the cube with two hands, which ensures secure rotation of Coral.

Sound Design and User Experience:

The Neutral state is defined as the Home square facing upward and the tapered side of the shell pointing at the player.

Tones:

Coral has families of codified tones. They are categorized as follows:

A. Utility:

  1. Welcome: A set of three tones indicating that the device has been powered on.
  2. Menu Confirmation: In Utility Mode, a tone indicating that a player’s gesture was received.
  3. Orientation: When the Home square is upside-down, a tone prompting the player to return the device to the Neutral state.

B. Gameplay:

  1. In-Game: Four tones total; one tone is mapped to each side of the device.
  2. Acknowledgement:
    a. Positive Tone: During Gameplay mode, the tone indicating that a tonal sequence was matched.
    b. Negative Tone: During Gameplay mode, the tone indicating that a tonal sequence was unmatched.

All tones are distinct, and the Utility Tones are differentiable from the Gameplay Tones, so players can tell from sound alone whether they are in the menu or in the game.

Similar to the game Simon, Coral has four In-Game Tones, which are loosely based on the F Major chord. These notes are:

F4: 349 Hz
A4: 440 Hz
C5: 523 Hz
F5: 698 Hz

The two F notes differ by an octave.

Whereas the tones of Simon are reminiscent of a trumpet fanfare, the In-Game Tones of Coral were inspired by an instrument called the “handpan”. All Coral tones are consonant, with the exception of the Negative Tone.
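As a small sketch of how those four notes could be mapped to the playable sides with Arduino's tone() function (the pin and the side-to-note assignment here are hypothetical):

```cpp
// Hypothetical mapping of the four playable sides to the F-major In-Game Tones.
const int PIEZO_PIN = 8;                            // assumed piezo pin
const int IN_GAME_TONES[4] = {349, 440, 523, 698};  // F4, A4, C5, F5

// Play the tone for one playable side of the cube.
void playSideTone(int side) {                       // side: 0-3
  tone(PIEZO_PIN, IN_GAME_TONES[side], 400);        // 400 ms note
  delay(450);                                       // short gap so consecutive notes stay distinct
}
```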

Example of User Experience (UX):

Player turns on device

Welcome Tone plays

Player selects Game Mode

Game Modes:

1 Player:

Tilt device towards self and return to Neutral State

Play confirmation tone

2 Players:

Tilt device away from self and return to Neutral State

Play confirmation tone

Game Mode: One Player

Device plays one In-Game Tone:

If matched, the Positive Tone plays. The device then plays a different In-Game Tone plus one more In-Game Tone (the sequence grows by one tone each round).

If unmatched, the Negative Tone plays. Game over; return to Game Mode Selection.

Device plays two In-Game Tones:

If matched, the Positive Tone plays. The device then plays two different In-Game Tones plus one more In-Game Tone.

If unmatched, the Negative Tone plays. Game over; return to Game Mode Selection.
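The actual gameplay code is linked at the bottom of this post; the round structure above boils down to something like the following hypothetical loop, where playSideTone() is the tone-mapping sketch shown earlier and the other helpers are placeholders:

```cpp
// Hypothetical sketch of the one-player round structure described above.
int sequence[32];                        // tone indices (0-3) chosen by the program
int sequenceLength = 1;                  // grows by one after every matched round

bool playOneRound() {
  for (int i = 0; i < sequenceLength; i++) {
    sequence[i] = random(4);             // pick a tone for each step
    playSideTone(sequence[i]);           // from the tone-mapping sketch above
  }
  for (int i = 0; i < sequenceLength; i++) {
    int entered = waitForNextFaceUp();   // placeholder: blocks until the player rotates the cube
    if (entered != sequence[i]) {
      playNegativeTone();                // placeholder: the Negative Tone
      return false;                      // game over; return to Game Mode Selection
    }
  }
  playPositiveTone();                    // placeholder: the Positive Tone
  sequenceLength++;                      // next round is one tone longer
  return true;
}
```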

Name:

The name Coral was chosen for numerous reasons. There is an obvious reference to the visual motif. The more subtle aspect is that the name can double as a person’s name. This feature alludes to the history of games, like Simon and Henry.

Discussion:

Humans are visually oriented. This preference is seen in such forms as movies, books, magazines, and live theater, and numerous facets of life, such as advertising, demonstrate a bias toward the sighted. Focusing on underexplored aspects can yield new and surprising results.

There are numerous technologies for the blind. However, they tend to be utilitarian, such as the walking stick. Quality of life includes ways of enjoying life, such as entertainment.

The use of wood in Coral refers to the tradition of wooden toys that spans centuries and diverse cultures, such as Ancient Egyptian. There are designers of contemporary wooden toys, such as Playsam.

Some lessons learned:

Challenge your assumptions:

There are many assumptions that sighted people make in their interactions with the world, such as relying on visual landmarks when giving directions. When visual cues are inaccessible, a new mode of communication needs to be established. In developing Coral, care was taken to make components such as the power switch tactilely accessible.

Many sighted people were drawn to the textures of Coral. Their visual study of the forms often led to touching and exploring the cube.

Prototypes:

The first prototype was made of cardboard, which was useful in understanding the basic interaction with the cube. The second prototype was made of Open Beam and acrylic, which helped in understanding weight and robustness. The third prototype emerged from tests of wood and textures.

coral_cardboard_prototype

coral_open_beam_prototype

prototype_wood-1

Differences in Visual Texture and Physical Texture:

Laser etching in wood can produce textures that are visually distinctive yet feel smooth and indistinguishable to the touch. Methods were therefore developed to produce physically distinctive textures for Coral; as a by-product, the visual textures became more striking.

coral_textures

The blind painter John Bramblitt discovered the ability to paint by identifying differences in the textures of color paints. He used “haptic visualization” as a means for seeing based on touch. Similar ideas were useful in developing the textures. Players can “see” their location on the cube by touch.

Happy Accidents:

Our team made new discoveries with the laser engraver: we had not expected it to be capable of such varied, dimensional textures.

The engraving process also gave the wood the appearance of a glazed finish and left a warm wood scent that lingered for days.

coral_stipple_texture_detail-1

Conclusion:

Coral introduces a new form of entertainment to an underserved community, as well as a new perspective on gameplay for a larger existing audience. Coral also occupies a novel niche of wooden electronics: it is more dynamic and interactive than traditional wooden toys, and it provides a tactile interface that is missing from modern audio games. The greater context is the exploration of traditional materials in a non-traditional setting.

View code here.

Table of Unique Harmonic Tones

genome

Place objects on the table to make music.

Built with openFrameworks, genome uses a projector mapped onto a table to display visuals and a Kinect mapped to the same table to track the positions of objects. Both mapping applications were custom-built. Tempo is kept by sending OSC messages from Max/MSP to the OF app, and notes are triggered by sending messages from the OF app back to Max.
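As a sketch of that message flow on the openFrameworks side using ofxOsc (the port numbers and OSC addresses here are hypothetical, not necessarily the ones genome uses):

```cpp
// Minimal ofxOsc sketch of the Max <-> openFrameworks exchange described above.
#include "ofMain.h"
#include "ofxOsc.h"

class ofApp : public ofBaseApp {
  ofxOscReceiver receiver;   // listens for tempo ticks from Max/MSP
  ofxOscSender   sender;     // sends note triggers back to Max

public:
  void setup() {
    receiver.setup(9000);                // hypothetical port Max sends beats to
    sender.setup("localhost", 9001);     // hypothetical port Max listens on
  }

  void update() {
    while (receiver.hasWaitingMessages()) {
      ofxOscMessage m;
      receiver.getNextMessage(m);
      if (m.getAddress() == "/beat") {   // on each beat from Max...
        ofxOscMessage note;              // ...trigger a note for an object the Kinect sees
        note.setAddress("/note");
        note.addIntArg(60);              // e.g. a pitch derived from the object's position
        sender.sendMessage(note, false);
      }
    }
  }
};
```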

Code here.

€!r€u!+ Br3@k3r$

“Circuit Breakers,” created by Amber Jones, Amanda Marano, Caitlin Quinlan, and Jack Taylor, is a piece that explores the modern representation of data, with an attempt to break it. In our project we explored the arts of datamoshing, circuit bending, and vocoding, all while projecting our visuals onto a pile of mirror balls to expand the area to which visuals are typically confined.

FINAL VIDEO:

 


ROLES

Circuit Bending – Amanda Marano

Datamoshing/Visuals – Jack Taylor

Composition – Caitlin Quinlan

Sound Design – Amber Jones


PROCESSES

Circuitbending 

To circuit-bend our children’s toy, we decided to modify it to do a basic pitch bend, that is, to create the ability to adjust the pitch and speed of the toy dynamically in real time. To do that, we first had to find the resistor that controlled the clock speed of the simple printed circuit board inside the toy. A clock in digital logic is a signal used to synchronize all of the signals sent throughout a digital circuit, so everything runs in the order it is supposed to and nothing breaks. By changing the clock, either increasing or decreasing its speed, we change the speed of the entire circuit, and with it the pitch of the sounds (low pitch for low speeds, high pitch for high speeds).

To find this resistor, we opened the back of the toy and located all of the larger components connected with wires (in our case, two resistors, a capacitor, and the speaker). While playing the instrument, we touched the leads of these components with our fingers (a licked finger creates a short in the circuit) and noted how the sound quality changed with each short. When we located the correct resistor, we snipped it out and soldered a potentiometer in its place. A potentiometer is a variable resistor that can be adjusted in real time by turning the dial or knob on top, either with your fingers or a screwdriver. After drilling a hole in the toy’s chassis and fitting the potentiometer through it, we were able to change the speed and pitch of the sounds by spinning the knob while playing notes.

During our performance, we placed a microphone by the speaker, but didn’t otherwise alter the original sounds in any way. – Amanda

Video of how the circuitbent piano functions:

Datamoshing

For our final project it was my job to create the visuals using a databending technique called datamoshing. Datamoshing is a process that removes certain frames in a video, which then allows it to glitch. For the visuals, I wanted to play with the idea of nature’s relationship to technology. I thought that videos of flowers blooming would be interesting to datamosh and would contribute to our overarching theme of glitch. I converted the videos into a different format using ffmpegX, then imported those clips into a program called Avidemux. With this program I was able to remove the I-frames from the clips (the full keyframes; without them, the remaining frames, which only encode motion differences from one frame to the next, get applied to the wrong image). I was also able to copy and paste frames, resulting in explosions of color and distorted movement. – Jack

ORIGINAL VIDEO:

Composition

For our project I was interested in using the vocoder. After Amber created the background music, I composed a melody that could be sung with the vocoder over it. The toy piano had a kind of funny and sad tone to it when it was played, so I paid homage to that in my lyrics and wrote:

“My circuits are bending

I can’t sing in tune

But maybe I can dance”

I imagined the vocoded voice as the voice of the toy piano; it is meant to express the “Frankenstein”-like nature of the piano, but in a sardonic way. These lyrics are repeated throughout the song. In the live performance I included the singing based on what other elements were involved. – Caitlin

Sound Design 

All of the post-processing effects and instruments used in our performance (besides the raw sound of the toy piano) were created using Ableton Live. Our original idea was to have a fun, upbeat, high-pitched, and slightly overwhelming dance track; however, once the toy piano was bent, I realized that the sounds were completely different than intended. It turned out that the ‘broken’ sound of the toy piano was more apparent when its clock was slowed down. Paired with the visuals, the song quickly changed from 160 bpm to 48 bpm. The song itself was 7 tracks: 4 instrument tracks, a drum kit, the carrier for the vocoder track, and the vocoder. Every instrument rack/preset I made used the ‘Analog’ instrument. I wanted the sounds to be obviously and unapologetically digital, but slightly detuned so they would seem a bit more organic. All of the sounds were made with either two saw/square oscillators or two saw oscillators.

guitar-like chords patch:

Screen Shot 2015-05-10 at 3.41.52 PM

melody:

Screen Shot 2015-05-09 at 3.10.15 PM

a spacey grain-delayed saw/square patch:

Screen Shot 2015-05-09 at 3.09.55 PM

guitar chords LPF’d & detuned more:

Screen Shot 2015-05-10 at 3.48.04 PM

bit reduced 808: 

Screen Shot 2015-05-09 at 3.09.40 PM

vocoder carrier: 

Screen Shot 2015-05-09 at 3.09.30 PM

vocoder:

Screen Shot 2015-05-09 at 3.09.16 PM

Amour, suppôts et supplications

Amour, suppôts et supplications is a representation of the different phases in a social and/or amorous relationship that we enter into all throughout our lives. Articulated around different movements connected by combinations of recurring timbres, this piece brings about numerous contrasts and emotions.

The inspiration for the piece comes from Chelsea. It is a universal subject that everybody is confronted with at some point. To bring this idea to life, we decided to work in two groups: electronic and acoustic. We agreed on a scale from which to build all our material: the acoustic mode on B. The separation of acoustic and electronic is a representation of the duality within a couple.

Draft of the introduction

 

Harmonic material

The first part, purely acoustic, represents the time of the meeting and the birth of emotions. Indecision and mystery are the main elements, transcribed in the music through imitation between the instruments, blended in a deep reverberation. The following movements represent the battle for power, the sharing of power, the engagement, and (in our case) the distance between the couple in the presence of a perturbing element.

The formal structure of the piece is fixed. During our last working session, we finally put the different elements together: the electronic part (loops composed by Yury and Kristian using Logic Pro, Ableton, and Max) and the acoustic part (Chelsea on electric harp and Jean-Patrick on prepared electric guitar). At this stage we also considered the importance of staging the piece and of including a theatrical aspect to illustrate our inspiration.

Hairy trio

 

Harp jail

 

Working hard!

The lighting is focused on the instruments in order to depersonalize the musicians, who are the actors of the performance. This depersonalization is a way to universalize the subject. The last movement of the performance illustrates the distance and the moment of doubt in the couple. The presence of Yury and Kristian at the end of the piece symbolizes the external factor that can create tension, doubt, and distance in a couple. The simultaneous presence of the two characters involved in the relationship and the two perturbing elements makes us question ourselves: Where are we going? What part does the external factor play in our happiness? What is really important?

 

Video edited by Chelsea Lane

War Water by Mutian, Jake, Steve

Our idea is to trigger MIDI notes with audio using trigg.me (4live.me). We use contact mics on the floor tom, snare, and kick drum. Steve shared his guitar sample and Jake added a lot of effects to it. We spent some time figuring out the best threshold for each mic, so that when Steve plays the kick drum the sound won't trigger the MIDI note for the snare drum. We were inspired by Little Flowers.

The general idea is: war about to happen -> really intense fight -> war ends, silence. Based on that idea, we decided the order in which we would perform the piece: start -> modulation sound from light tracking -> pad sound triggered by snare -> pad sound fade out -> floor tom -> guitar sound -> kick tom and more guitar sound -> modulation on guitar sound and pad sound fade in -> speed up -> end.

Technologies used:
Max for Live
Trigg.me
Brainwash
Processing

Screen Shot 2015-05-06 at 1.59.03 PM

Screen Shot 2015-05-06 at 2.00.04 PM

The first block at the left is the Trigg.me plugin, which we use to adjust the threshold. The ‘Altered Scale’ block in the middle is used for designing the pitch pattern. We use the guitar sample that Steve recorded and a pad as our source sounds.

We tried putting mics on the ride cymbal and hi-hat, but they both have very long decay times and are hard to control, so we decided to use only the tom, snare, and kick.

Screen Shot 2015-05-06 at 2.00.39 PM

Screen Shot 2015-05-06 at 1.59.33 PM

Screen Shot 2015-05-06 at 2.00.19 PM

Jake also uses the Brainwash plugin to trigger the video; the image changes with the overall volume.

Screen Shot 2015-05-06 at 2.00.29 PM

Screen Shot 2015-05-06 at 1.58.31 PM

We tried different sound effects and modulation. Jake connected Ableton to a MIDI controller so he could manipulate those sounds while Steve was playing.

We discussed adding an interactive part, and Steve suggested using a flashlight to control sound effects. Steve worked on a Processing program for light tracking; it plays an FM synthesizer developed by Roger Dannenberg. Download the code.
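Steve's actual program is the Processing code linked above; purely to illustrate the light-tracking idea, here is a minimal brightest-pixel tracker sketched in openFrameworks (C++), with the hookup to the FM synth left as a placeholder:

```cpp
// Illustration only: find the brightest pixel in the camera frame (the flashlight)
// and turn its x-position into a control value.
#include "ofMain.h"

class ofApp : public ofBaseApp {
  ofVideoGrabber cam;

public:
  void setup() { cam.setup(640, 480); }

  void update() {
    cam.update();
    if (!cam.isFrameNew()) return;

    ofPixels &pix = cam.getPixels();
    int w = pix.getWidth(), h = pix.getHeight();
    float brightest = -1;
    int bx = 0;
    for (int y = 0; y < h; y++) {
      for (int x = 0; x < w; x++) {
        float b = pix.getColor(x, y).getBrightness();
        if (b > brightest) { brightest = b; bx = x; }
      }
    }
    // Map the flashlight's horizontal position to, e.g., an FM modulation index.
    float control = ofMap(bx, 0, w, 0.0f, 1.0f);
    // sendToSynth(control);   // placeholder: forward the value to the synth
  }
};
```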

Mutian also worked on a Max patch for light tracking that follows the position of the flashlight. It works well.

image07

Watch the light-tracking test video (Max). We decided to use Steve's program for light tracking because it generates some really cool sounds.

After setting everything up and holding our first practice, we found that the tom and guitar sounded like a ‘war theme’, so we adjusted the sound effects and modulation to fit that theme.

First rehearsal video.

Team Roles

Steve

  • Light Tracking & FM Synth w/ Processing
  • Drums

Jake

  • Audio & Video Effects
  • Music Production Controller (MPC)

Matty

  • Flashlight
  • Light Tracking w/ Max
  • Research
  • Song Arc / Order
  • Documentation

A Singularity

For my final project, I wanted to find a way to combine many of the field recordings I had made, along with other, much longer sound recordings and experiments from over the semester, into one final piece. These recordings consisted of sounds in my studio (an hour-long recording of the radiator banging), moments from A Brief History of Time (mostly recordings of Stephen Hawking), recordings from the Bayernhof Museum, and live and altered recordings from the robot project I was a part of (recorded afterwards and independent of the scope of the initial project). I had been playing with these sounds for several weeks in Ableton Live and had become interested in the mood and message created when I combined them.

While I had initially planned to build another robot to play along with the recordings (triggered by motion) and held within several sculptural containers, once I started adding the recordings into Ableton Live and mapping them to the Alias 8 controller, I realized I wanted to have more control over the sounds. After going over some of the past posts Jesse made related to the Kinect, I recalled that when I first came here, one of the initial projects I had wanted to create was a body-controlled sound piece. I had done some explorations with the Kinect before, but hadn't used it with Max, and I was excited about the possibilities with Ableton Live through Synapse. Unfortunately, Synapse can only be run with the first generation of Kinect (which I thought I had), so the computer in the sound lab was showing missing areas in Synapse when I tried to run it. The program that offered another way to use Synapse was this one, and it seemed fairly easy to get up and running; unfortunately, I had issues with that one as well.

Shifting gears, I instead used the dp.kinect external that I had already installed (and reworked the patch). I then added some of the new Vizzie effects in Max 7, along with a subpatch that allows changes in amplitude to alter both the delay and the zoom of the visuals. After much trial and error, I set the parameters of the Kinect so that the delay produced a jump in the time and place of the captured body, corresponding with the mood and message of the samples and effects I had created in Ableton Live.

singularity ableton live patch

singularity max7 patch screenshot

Version of the final performance/installation:

Hopefully I can replace that with a better version tomorrow.

Good Vibrations

The setting is unremarkable: the beds are made, the lights are on, the bathroom is clean, the floors are swept. The room is empty, except for the invasive electrical wires and the sustained vibrating hum that pervades the space.

The installation took place at a Days Inn. Surface transducer speakers were connected to every object in the room, giving “life” to each object and making the ordinary visible. This is documentation of my final project.

Nature Sounds Interacted

Team Members: Yury, Zhiwan, Brittany, Gwen

Our performance was based on field recordings of nature (animals, trees, etc.) combined with interactive visuals and a live performance of poetry. The goal was to create a free-flowing, avant-garde performance that was never the same twice.

Roles:

Yury Merman: I found a myriad of high-quality nature sounds online (mainly on SoundCloud) and chopped up the samples. I also processed many of the sounds, either with electronic, synthetic-sounding effects or with mixing effects such as EQ and filtering.

I used Logic Pro for the sound design and editing, and then used Ableton to perform and trigger the sounds.

I had various nature sounds on different tracks, and they would play at random on their own. I could also control the triggering of the sounds, switching up parts much as a DJ/producer would with a standard electronic track.

I had an outline of the audio structure where it would start slow, build up, have many differing sounds, and then vary throughout until the end of the performance, during which the sounds became more minimal.

Ableton was also connected to Max for Live, which allowed the audio to trigger effects on the visuals we displayed during the performance. Based on qualities such as frequency and intensity, the visuals would change in frame rate, color, and filtering.

 

Zhiwan: Created a Max patch that allowed the audio to trigger the visuals. He could also control all the visuals manually, changing them in real time during the performance.

 

Brittany: Provided video that we used for the visuals, and also performed poetry, samples of which I’ve provided below:

a blindman's eyebrows
condensing the autumn fog
into beads of light

squeezing his eyes shut,
the cat yawns as if about
to eat the spring world.

black winter hills
nibbling the sinking sun
with stark stumpy teeth.

All the haiku are by Richard Wright, an American poet, who wrote them during his last months of life.
Gwen: Also provided visuals and performed the last poem. She also lit matches, which gave our performance a more primal aesthetic (fire = nature).


Deux regards perdus vers l’horizon

Deux regards perdus vers l’horizon is a live performance for amplified cello, amplified sitar, and electronics performed by Jake Bernsten, Jean-Patrick Besingrand, Caitlin Quinlan, and Kristian Tchetechko.

After a quick brainstorming session, the idea for a piece mixing instruments from different traditions (classical, Indian, electronic) came naturally. The poetic idea of the piece was the starting point of its composition. After a few sketches, we set up a fixed formal structure that allowed us to improvise inside it; proceeding this way made the performance more coherent. The formal structure is close to a perfect arch form. The material is derived from the night-time raga Yaman Kalyan, with the note C sharp as a polar reference. After a first section based on this raga, a noisy element is introduced little by little, leading to a middle section based on noisy sounds. After this section, the raga reappears progressively.

 

First beautifully handwritten sketch by Caitlin

 

IMG_1440
Final score

 

The sitar and the cello are both amplified and fed into a Max patch. This patch, conceived by Kristian, includes a sample recorder and shuffler for the cello and a randomized pitch delay for the sitar. Both instruments benefit from a strong reverberation.

Jake, the central element of the piece, controls samples in Ableton through a MIDI keyboard, allowing him to interact with multiple parameters of the sound.

An important element of the performance is the visual component generated by the Max patch. The acoustic instruments, as well as the samples controlled by Jake, are routed into an x-y matrix that visually represents the changing stereo field. This visual element is a concrete representation of the poetic idea of the piece.

Program note:

Deux regards perdus vers l’horizon represents the perturbations experienced by two people who, little by little, drift apart from each other and then find their way back together. This piece depicts different moments of that process. The acoustic instruments represent the two people, while the electronic part symbolizes the main interest the two have in common, which is the basis of the relationship and the basis of the piece.

Video of the performance edited by Kristian:

 

 

The Internet Aesthetic

Team Members: Amanda Marano, Chelsea Lane, Mutian Fu, Jaime Dickerson

Our performance was based on the performance that Jesse showed us in class using the MIDI controller, as well as on other electronic pop artists we listened to online, such as Madeon. Our goal was to create a fun, upbeat dance piece that was unique every time it was played.

Tools/Roles:

We used three different controllers during our performance. Jaime used a MIDI keyboard of her own, connected directly to the mixing board and output to the speakers. The effects were generated by different settings on her keyboard, as well as by a voice modifier and a microphone attached to the keyboard. Jaime played this keyboard during the final performance.

Matty was in charge of a MIDI controller with knobs, connected to a Max patch that directly controlled a video she edited of My Keepon dancing, compiled from YouTube videos. Three knobs controlled the RGB color levels in the video; if all three levels were set to 1, the video was restored to its original colors. She used this board during the final presentation.

Max-patch

IMG_3287

 

This is the original version of the video Matty edited that was used in the Max patch:

Chelsea and Matty created an Ableton Live file that included sound bites the group got from SoundCloud, including dubstep drum beats and Latin rhythms. Matty mapped each group of sounds onto a different page of a third MIDI controller, along with different sound effects. One sound can be played from each group, or page, and any or all of the effects on that page can be activated or deactivated for that sound.

Screen Shot 2015-03-24 at 11.49.46 PM

Chelsea wrote a rough composition for using this Ableton Live file during the performance. Amanda performed with this board in the final presentation; she used the starting sounds from the composition but deviated from there until the end of the piece.

The composition used the format page-clipNumber and initially looked like this:

Intro:

10-1

9-3 (Add soon after 10-1)

8-1 (Add only once music builds in intensity)

We rehearsed with Chelsea using the Ableton MIDI controller before we added the audio effects, and with Amanda using the controller afterwards for the final presentation. Each run-through was ultimately completely improvised, with Jaime and Matty reacting and playing in response to whatever Chelsea or Amanda was playing with the Ableton sound clips and effects.

Here’s the final presentation: