Instruments Sans Frontières is an umbrella term for a set of projects I am working on. The video above is a short presentation I gave for the Global Health Hackathon 2014 at Yale University’s Center for Engineering Innovation and Design on January 24, 2014.
Instruments Sans Frontières aims to overcome the limitations of acoustic instruments for the disabled by developing a novel musical interface employing three key technologies: (a) sensor and motion-tracking technology, (b) software physical models of instruments (DSP), and (c) specially designed electro-acoustic resonators.
The project will work closely with patients and doctors at the School of Medicine to develop a set of wearable devices using Arduinos, flex sensors, force sensors, accelerometers, and breath sensors, in addition to the Microsoft Kinect and Leap Motion, to control parameters such as note selection and timbre in a software instrument. The generated sound will be delivered through an “electro-acoustic” resonator that mimics the soundboard of a real instrument. The end product will be an electro-acoustic instrument that can accommodate various disabilities and leverages the exciting marriage of modern sensors and signal processing with centuries of instrument refinement and design.
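To make the sensor-to-parameter mapping concrete, here is a minimal sketch (not the project's actual code) of how a raw reading from something like a flex sensor on an Arduino's 10-bit ADC might be quantized to a note. The 0–1023 range and the C-major MIDI scale are illustrative assumptions; quantizing to a scale keeps every gesture musically "in key", which matters when fine motor control is limited.

```python
def sensor_to_note(reading, lo=0, hi=1023,
                   scale=(60, 62, 64, 65, 67, 69, 71, 72)):
    """Map a raw sensor reading (assumed 10-bit, 0-1023, as from an
    Arduino analogRead) onto one note of a C-major scale, expressed
    as MIDI note numbers (60 = middle C)."""
    reading = max(lo, min(hi, reading))                # clamp out-of-range values
    index = (reading - lo) * len(scale) // (hi - lo + 1)  # evenly split the range
    return scale[index]
```

A fully extended sensor might then select middle C (`sensor_to_note(0)` gives 60), while a fully flexed one selects the octave above (`sensor_to_note(1023)` gives 72), with the scale steps spread evenly in between.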
Music therapy has shown promising results in patient recovery. Imagine the positive effects that a disabled patient may gain through “active music therapy” by taking part in the creation of music. Loss of motor skills, especially in hand and finger coordination, can make it impossible for a patient to play an acoustic instrument. Moreover, redesigning the mechanisms of instruments to fit the multitude of disabilities is a very difficult and often impossible task. How can we empower disabled patients with musical expression?
We thus turn our attention to software instruments. By combining our understanding of instrument acoustics and digital signal processing, software physical models of instruments have been created that sound remarkably similar to the real thing and have garnered commercial success in the music industry. In the software domain, instrumental parameters can be accurately controlled using sensors and motion-tracking devices, so instrumental control can be distributed to other functioning parts of the body. If the patient can still flex their knees, why not use this for note selection? Research by others in this area includes the modification of acoustic instruments and the use of webcams to track head movement. However, the hardware-software approach of this project, pairing modern sensors with software instruments, has yet to be explored. Moreover, the presence of a physical instrument (acoustic resonator) provides a much more intimate experience and greater emotional connection than sound produced from a speaker.
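As a flavour of what "software physical models" means in practice, the sketch below implements the classic Karplus-Strong plucked-string algorithm, one of the simplest physical models: a burst of noise (the pluck) circulates in a delay line whose length sets the pitch, and an averaging low-pass filter in the feedback loop makes the high harmonics decay faster, as they do on a real string. This is a textbook illustration, not the specific model used in the project.

```python
import random

def karplus_strong(frequency, duration, sample_rate=44100, seed=0):
    """Minimal Karplus-Strong plucked-string model.

    A delay line of length sample_rate/frequency is filled with white
    noise; each output sample is read from the line and replaced by the
    average of itself and its neighbour, so the 'string' rings at the
    chosen pitch and decays naturally."""
    random.seed(seed)                                 # deterministic for the demo
    n = int(sample_rate / frequency)                  # delay length ~ one period
    buf = [random.uniform(-1.0, 1.0) for _ in range(n)]  # the pluck: noise burst
    out = []
    for i in range(int(duration * sample_rate)):
        j = i % n
        out.append(buf[j])
        buf[j] = 0.5 * (buf[j] + buf[(j + 1) % n])    # low-pass filter + feedback
    return out
```

Calling `karplus_strong(440, 0.5)` yields half a second of samples of a decaying 440 Hz pluck; writing them to a WAV file, or swapping the filter for a more elaborate one, is where the real DSP work of a full physical model begins.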