Human-Computer Interaction: Granular Synthesis and Localization of Sound

This demo is a culmination of the projects I’ve been working on this past semester. It combines code I wrote for the Leap Motion for gestural input, a granular synthesizer I made in Max, and a hemispherical speaker that I built.

As explained in the video, I use the Leap Motion to do two things. First, it tracks the position of my left hand and maps it onto a “map” I’ve made in Max, so that when I hover my left hand over the top speaker in the map, sound comes out of the corresponding top driver of the hemispherical speaker, and so on. Vertical motion is mapped to overall volume. This allows dynamic, gestural control over the localization of the sound coming out of the speaker.
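Roughly, the left-hand mapping works like the sketch below. This is Python rather than Max, and the driver layout, spread factor, and inverse-distance weighting are illustrative assumptions, not details taken from the actual patch:

```python
import math

# Hypothetical driver layout: (x, y) map coordinates for each driver
# on the hemispherical speaker (the real layout lives in the Max "map").
DRIVERS = {
    "top":   (0.0,  0.0),
    "north": (0.0,  1.0),
    "east":  (1.0,  0.0),
    "south": (0.0, -1.0),
    "west":  (-1.0, 0.0),
}

def localization_gains(hand_x, hand_y, hand_height, spread=1.0):
    """Map left-hand position to per-driver gains and a master volume.

    hand_x, hand_y -- hand position projected onto the speaker map
    hand_height    -- vertical hand position, normalized to 0..1
    spread         -- how sharply sound focuses on the nearest driver
    """
    gains = {}
    for name, (dx, dy) in DRIVERS.items():
        dist = math.hypot(hand_x - dx, hand_y - dy)
        # Closer drivers get more gain: simple inverse-distance weighting.
        gains[name] = 1.0 / (1.0 + spread * dist)

    # Normalize so the gains sum to 1.
    total = sum(gains.values())
    gains = {name: g / total for name, g in gains.items()}

    # Vertical motion controls overall volume.
    master_volume = max(0.0, min(1.0, hand_height))
    return gains, master_volume
```

Hovering directly over a driver’s map position concentrates the gain there; moving the hand up or down scales the whole output.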

Second, the Leap Motion tracks my right hand, which controls a motion-activated granular synthesizer. Horizontal displacement lets the user scrub through the sample, and vertical displacement controls the grain length. Thanks to the accuracy of the Leap Motion, I can also track individual fingers and use each finger to trigger its own grain. This gives the synthesizer a pleasantly “organic” sound compared to traditional granular synthesizers, which trigger grains on a periodic function. Although traditional granular synthesizers can introduce “randomness” into that periodic trigger, a Human-Computer Interaction (HCI) granular synthesizer provides more dynamic and intuitive control of the sound.
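The right-hand mapping could be sketched like this (again in Python as a stand-in for the Max patch; the normalization ranges, length bounds, and the downward-flick trigger threshold are assumptions for illustration):

```python
from dataclasses import dataclass

@dataclass
class Grain:
    start: float   # playback position within the sample, in seconds
    length: float  # grain length, in seconds

def right_hand_to_grains(hand_x, hand_y, finger_tips, prev_tips,
                         sample_length, min_len=0.02, max_len=0.5,
                         trigger_threshold=0.03):
    """Turn right-hand motion into grain triggers.

    hand_x, hand_y    -- normalized (0..1) horizontal/vertical hand position
    finger_tips       -- current fingertip heights, one per tracked finger
    prev_tips         -- fingertip heights from the previous frame
    sample_length     -- length of the loaded sample, in seconds
    trigger_threshold -- downward fingertip movement needed to fire a grain
    """
    # Horizontal displacement scrubs through the sample.
    scrub_position = hand_x * sample_length
    # Vertical displacement sets the grain length.
    grain_length = min_len + hand_y * (max_len - min_len)

    grains = []
    for now, before in zip(finger_tips, prev_tips):
        # A quick downward flick of a finger triggers one grain.
        if before - now > trigger_threshold:
            grains.append(Grain(start=scrub_position, length=grain_length))
    return grains
```

Because each finger fires its own grain, the grain stream follows the rhythm of the hand rather than a fixed clock, which is where the “organic” quality comes from.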

When the two hands are used together, the user can rely on purely human gestures to control and dynamically modulate a five-grain granular synthesizer and the localization of the produced sound! The HCI aspect of this demo has exciting applications for performance in live settings and, more importantly, for making this technology more accessible to users. After this demo, for example, a composition student approached me saying that he had never used granular synthesis, but would love to use this particular setup in a future work!

When we play an acoustic instrument such as a cello, we are in tactile contact with the string, and our physical movements are inextricably connected with the produced sound. With electronic sounds, however, our interactions are limited to knobs, faders, and sometimes multitouch interfaces, resulting in a degree of “disconnectedness” from the produced sound. Motion tracking provides a new dimension of “tactile” interaction with electronic sounds through intuitive control using human gestures. This reminded me of the reaction many of my friends had to my hand-waving dubstep. Put Ableton in front of someone and say “make some dubstep” and they would probably walk away, but using the right tools to put “control” into the hands of the user through HCI makes the experience a completely different and enjoyable one!
