Incredibly excited to feature work from our interdepartmental team spanning AI, ultrasound, and physiology, developing novel prosthetics that leverage ultrasound and machine learning. We were featured just last month (December 2017) by NVIDIA, IEEE, CNN, and many other news outlets. Our system delivers first-in-class finger-by-finger control for amputees, enabling high-dexterity tasks like playing the piano, a feat impossible with the sensing on today's conventional prosthetics. By using ultrasound instead of traditional sensors like electromyography (EMG), we can "see" the activity of muscles deeper in the arm.
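To give a flavor of the sensing-to-control idea, here is a minimal sketch (purely illustrative, not our actual pipeline): treat each ultrasound frame as a feature vector and train a standard classifier to predict which finger is flexing. The array shapes, synthetic data, and classifier choice are all assumptions for demonstration.

```python
# Illustrative only: mapping ultrasound-derived features to per-finger
# activity. Shapes, features, and classifier are assumptions, not the
# team's actual system.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-in data: 500 "frames" of 64 ultrasound intensity features each,
# labeled with which of 5 fingers is flexing (synthetic).
n_frames, n_features, n_fingers = 500, 64, 5
y = rng.integers(0, n_fingers, size=n_frames)
# Give each finger class a distinct mean activation pattern plus noise,
# mimicking how deep-muscle deformation patterns differ per finger.
class_means = rng.normal(size=(n_fingers, n_features))
X = class_means[y] + 0.5 * rng.normal(size=(n_frames, n_features))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# A standard classifier over normalized features is enough to show the idea.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

In a real system the per-frame prediction would then drive the prosthetic hand's actuators in a closed loop.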
I am one of three team members leading machine learning on this project. My MS thesis develops the second iteration of the system; I can't say too much about it until the thesis is published in May 2018 =D. My approach skips the traditional imaging step, and the work is advised by Gil Weinberg (Georgia Tech), Byron Boots (Google Brain), and Mason Bretan (Futurewei Inc.).
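Since the thesis itself is still under wraps, here is only a generic, hypothetical sketch of what "skipping the imaging step" can mean: regressing finger state directly from raw echo samples instead of reconstructing a B-mode image first. Every name, shape, and the synthetic data below are assumptions, not the thesis method.

```python
# Hypothetical illustration of the general concept of skipping image
# reconstruction: learn a mapping from raw ultrasound echo samples straight
# to a finger-flexion estimate. All details here are assumptions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Stand-in raw data: 400 windows of 256 echo samples from one transducer.
n_windows, n_samples = 400, 256
X = rng.normal(size=(n_windows, n_samples))
# Synthetic target: a flexion angle correlated with the echo window,
# standing in for muscle deformation visible in the raw signal.
w_true = rng.normal(size=n_samples)
y = X @ w_true + 0.1 * rng.normal(size=n_windows)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=1)

# Regress flexion directly from raw samples: no B-mode image in the loop.
model = Ridge(alpha=1.0).fit(X_train, y_train)
print(f"held-out R^2: {model.score(X_test, y_test):.2f}")
```

The appeal of operating on the raw signal is avoiding the latency and information loss of beamforming an image only to discard most of it downstream.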