Instruments Sans Frontieres Prototype I

I had the wonderful opportunity of working with Vrunda Patel this past spring break. Vrunda and her family moved to the States to seek treatment for cerebral palsy, and we were put in contact through a professor at the Yale School of Public Health. Working with Vrunda was a real joy!

I made a working prototype using gloves and leggings fitted with flex sensors. In this demo, Vrunda chooses a note by flexing her arm and triggers it by pulling her index finger. The idea is to disembody instrumental control so that parameters such as note selection and note volume can be controlled in unconventional ways. If a patient has difficulty moving their fingers, why not use elbow motion instead? The device worked well when I demonstrated it, but I learnt that many things I take for granted as a musician do not immediately transfer to the end user. Vrunda explained that controlling notes “in mid-air” with no tactile feedback was actually very confusing and unintuitive. Moreover, the Max patch assumes that the user understands a piano layout. Vrunda doesn’t read music or play the piano.
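For those curious about the plumbing, here is a minimal sketch of the mapping, assuming an Arduino with an elbow flex sensor on analog pin A0 and a finger flex sensor on A1. The pin assignments, threshold, and scale are all illustrative; in the actual prototype the note logic lives in the Max patch, but the idea is the same.

```cpp
// Illustrative sketch: elbow flex selects the note, a finger pull
// triggers it. Pins, threshold, and scale are assumptions, not the
// exact values from the prototype.

const int ELBOW_PIN  = A0;   // flex sensor at the elbow
const int FINGER_PIN = A1;   // flex sensor on the index finger

const int FINGER_THRESHOLD = 600;  // a "pull" is detected above this reading
const int SCALE[] = {60, 62, 64, 65, 67, 69, 71, 72};  // C major, MIDI notes
const int NUM_NOTES = 8;

bool notePlaying = false;
int  currentNote = 60;

void sendMidi(byte status, byte data1, byte data2) {
  Serial.write(status);
  Serial.write(data1);
  Serial.write(data2);
}

void setup() {
  Serial.begin(31250);  // standard MIDI baud rate
}

void loop() {
  // Map the elbow reading onto an index into the scale.
  int elbow = analogRead(ELBOW_PIN);                 // 0..1023
  int index = map(elbow, 0, 1023, 0, NUM_NOTES - 1);
  index = constrain(index, 0, NUM_NOTES - 1);

  int finger = analogRead(FINGER_PIN);

  if (finger > FINGER_THRESHOLD && !notePlaying) {
    currentNote = SCALE[index];
    sendMidi(0x90, currentNote, 100);  // note on, velocity 100
    notePlaying = true;
  } else if (finger <= FINGER_THRESHOLD && notePlaying) {
    sendMidi(0x80, currentNote, 0);    // note off
    notePlaying = false;
  }

  delay(10);  // simple debounce / rate limit
}
```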

For the next iteration, I will build a large GUI that shows the note’s letter name, a piano roll, and the notated music. Vrunda’s point about haptic feedback was also important. As a performer who regularly uses this kind of musical HCI, I can play the setup with no problem. To me, it makes sense that flexing different parts of my body changes the notes because, well, I designed it that way. But for non-musician users, the entire concept of playing an instrument by flexing one’s arms is not as straightforward as I assumed. Interface design and delivery to the end user matter more than I previously anticipated.
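As one small piece of that GUI, converting a MIDI note number into a letter name the user can actually read is simple. A sketch, assuming the common convention that MIDI note 60 is middle C (C4):

```cpp
#include <iostream>
#include <string>

// Convert a MIDI note number to a readable name like "C4" or "F#4".
// Assumes the convention that MIDI note 60 = C4 (middle C).
std::string noteName(int midiNote) {
  static const char* NAMES[12] = {
    "C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"
  };
  int octave = midiNote / 12 - 1;  // MIDI note 0 is C-1
  return std::string(NAMES[midiNote % 12]) + std::to_string(octave);
}

int main() {
  std::cout << noteName(60) << "\n";  // prints "C4"
  std::cout << noteName(66) << "\n";  // prints "F#4"
  return 0;
}
```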

I am interested in applying innovations in musical HCI to the context of physical disability. Imogen Heap’s gloves, Elena Jessop’s vocal gloves, and the MIT Opera of the Future’s “Disembodied Performance” have all explored ways that sensors can turn human motion and gesture into expressive music. How can we apply these technologies when movement is limited? If a patient has limited control of their arms and is thus unable to play an acoustic instrument, how can these interfaces extract the maximum amount of musical control? How can we empower people with disabilities through musical expression?

With this in mind, I found Imogen Heap’s Kickstarter campaign for her Mi.Mu gloves very exciting! It’s so great to see Imogen Heap putting a tried-and-tested product out into the market, and I have been very inspired by her team’s designs.

Vrunda also liked the Midi Fighter I brought. I am thinking of building a device shaped like the body of a guitar, with a resonant body and sound post. The sound will be produced by electromagnetic (EM) actuators, and arcade buttons will be mounted on the device. The user will thus be able to feel the instrument vibrate like a real instrument, which will hopefully make the device more of a “tangible” experience.
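To make the idea concrete, here is a minimal sketch of how a button press could trigger a MIDI note and drive the actuator at the same time, assuming an Arduino with an arcade button on digital pin 2 and an EM actuator (driven through a transistor) on pin 9. The pins, note, and drive frequency are all illustrative.

```cpp
// Illustrative sketch: pressing an arcade button both triggers a
// MIDI note and drives an EM actuator so the instrument's body
// vibrates. Pin numbers and frequencies are assumptions.

const int BUTTON_PIN   = 2;   // arcade button (wired to ground, uses pullup)
const int ACTUATOR_PIN = 9;   // EM actuator via a transistor driver

const int NOTE    = 60;       // middle C
const int NOTE_HZ = 262;      // approx. frequency of middle C, for the actuator

bool pressed = false;

void sendMidi(byte status, byte data1, byte data2) {
  Serial.write(status);
  Serial.write(data1);
  Serial.write(data2);
}

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);  // button reads LOW when pressed
  pinMode(ACTUATOR_PIN, OUTPUT);
  Serial.begin(31250);                // MIDI baud rate
}

void loop() {
  bool down = (digitalRead(BUTTON_PIN) == LOW);

  if (down && !pressed) {
    sendMidi(0x90, NOTE, 100);    // note on
    tone(ACTUATOR_PIN, NOTE_HZ);  // vibrate the body at the note's pitch
    pressed = true;
  } else if (!down && pressed) {
    sendMidi(0x80, NOTE, 0);      // note off
    noTone(ACTUATOR_PIN);
    pressed = false;
  }
}
```

Driving the actuator at the note’s own frequency is one way to make the vibration feel connected to the sound, rather than a generic buzz.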
