
Mattias Russo-Larsson • MRL Design

Slack~Sounds

Slack~Sounds was my thesis project in the UCLA Design & Media Arts program: an interactive musical performance that fuses athletics, music, and interactive interfaces.


Slacklining started in the mid-1970s, when climbers in Yosemite began tying their lines between trees and balancing on them during down time. At its core, slacklining is the act of balancing on a narrow piece of webbing that moves dynamically underfoot. It's not a unique realization that music and slacklining are inherently compatible; listening to music while slacklining is part of the culture. Like playing an instrument, slacklining requires total focus between mind and body, and like playing music, it requires that you quiet your critical thinking and move into a meditative state. Music while slacking helps because it draws your attention away from distracting thoughts, allowing you to feel and react instinctually to the line instead of thinking analytically.

While talking with my friend Luis Talavera, who introduced me to slacklining, we came up with the idea of turning a slackline into an instrument by tracking the motion of the line while someone walks on it and converting that data into frequency and amplitude. It seemed like the perfect symbiotic way to enhance the experience of slacklining. Hence Slack~Sounds was born.

The first step in the project was capturing the motion of the slackline. The Leap Motion device uses two infrared cameras to detect motion along the X, Y, and Z axes with astounding precision. The only complication is that it is specifically programmed to detect hands, hence the creepy hand on the line.

I then use Max, a graphical programming environment, to unpack the data from the Leap and build a patch that uses it to select different notes in a scale. I run this patch from Ableton Live, which lets me specify which synthesizer parameters change along the X or Y axis. Pitch and rhythm are controlled by movement along the Y axis, and the timbre of the sound is controlled by modulation mapped across the X axis: higher pitches and faster rhythms occur when the hand moves up, and lower pitches and slower rhythms when it moves down.

The notes generated from the slackline all sound good together because my motion isn't mapped linearly across every chromatic pitch, but only onto the notes I select in a step sequencer. If you imagine piano keys as stairs, the program uses the height of the hand to determine which step, or note, to jump to. Every note is also quantized, which is to say it always plays in sync with a tempo that I determine. This lets me seamlessly play along with any song I like by setting the global key and tempo to match the song of my choice.
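The actual mapping lives in the Max patch and Ableton Live, but the core logic is simple enough to sketch. The Python below is only illustrative: the scale, root note, octave range, and quantization grid are placeholder assumptions rather than values from the real patch, and the hand position is assumed to arrive already normalized to a 0–1 range.

```python
# A minimal sketch (not the real Max patch) of the mapping described above:
# hand height picks a step in a user-chosen scale, horizontal position
# becomes a 0-127 modulation value for timbre, and note triggers are
# snapped to a tempo grid. All constants below are placeholder assumptions,
# and x/y are assumed to be already normalized to the 0..1 range.

SCALE_STEPS = [0, 3, 5, 7, 10]   # e.g. a minor-pentatonic "step sequencer" selection
ROOT_NOTE = 48                   # global key: C3 as a MIDI note number
OCTAVES = 2                      # how many octaves the hand's travel spans
GRID_BEATS = 0.25                # quantize triggers to sixteenth notes


def y_to_note(y: float) -> int:
    """Treat the scale like stairs: the hand's height picks which step (note) to jump to."""
    steps = [o * 12 + s for o in range(OCTAVES) for s in SCALE_STEPS]
    index = min(int(y * len(steps)), len(steps) - 1)
    return ROOT_NOTE + steps[index]


def x_to_modulation(x: float) -> int:
    """Map horizontal position to a 0-127 controller value that shapes the timbre."""
    return max(0, min(127, int(x * 127)))


def quantize(trigger_beats: float, grid: float = GRID_BEATS) -> float:
    """Snap a trigger time (in beats) to the nearest grid line so it stays in sync with the tempo."""
    return round(trigger_beats / grid) * grid


# Example: a hand two thirds of the way up the tracking range, slightly
# left of centre, triggered just after beat 3.75.
if __name__ == "__main__":
    print(y_to_note(0.66), x_to_modulation(0.4), quantize(3.79))
```

Because the hand position only ever lands on steps of the chosen scale and every trigger is pulled onto the grid, anything played over a backing song in the same key and tempo will fit, which is exactly what the step sequencer and quantization in the real setup are there for.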