When people ask what I study and I answer a double major in Physics and Music, the response is almost always one of the following two.
The first:
“Oh man, that’s a really interesting combination, they are so fundamentally connected!”
But more often the second:
“Physics and Music? Aren’t they like the complete opposites of one another?”
How can a physical science be so related to music? Isn’t physics about equations and integrals, while music is about composition and artistic expression? Yes, but that’s only a subset of what each discipline is about.
Edgard Varèse, a prominent 20th-century French-born composer, once said:
“Music is organized sound” – Edgard Varèse
Traditionally, musicians and academics interpret Varèse’s quote the way he intended: from an artistic perspective. Music is indeed organized sound, a notion that 20th-century composers like John Cage pushed to its extreme in pieces such as Water Walk (1960), built from a succession of household noises.
To me, Varèse’s quote resonates from a more technological perspective. If one thinks of music not just as organized sound, but as a subset of a larger natural phenomenon called sound, then the links between music (or sound) and many other STEM disciplines, such as physics, electrical engineering, and computer science, become clear.
Innovations in sound and audio technology, from the electric guitar to the microphones and speakers ubiquitous in our smartphones, tablets, and computers, were all invented by physicists, electrical engineers, and computer scientists working at the intersection of STEM and sound.
Speaker and headphone companies like Bose, Sennheiser, Beats, Sony, and many more rely on sound-reproduction techniques from this field.
Microphone companies like Rode, Neumann, and Blue likewise rely on its sound-capture technologies.
Even apps like Shazam require knowledge of signal processing.
That crystal-clear voice on your last Skype call? Also signal processing.
The entire video and audio industry, from news broadcasting to Hollywood movie sets, requires embedded hardware systems, in both studio and mobile formats, for capturing audio.
Music software like Ableton and Logic Pro, along with plug-ins from Eventide, Native Instruments, and Waves, all rely on programmers who translate the mathematical rules of sound propagation into the digital domain.
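To make “signal processing” a little more concrete, here is a toy Python sketch (my own illustration, not code from any of the products above): it synthesizes a pure tone, adds an attenuated, delayed copy of it to mimic a single reflection (the essence of an echo or delay plug-in), and then uses a Fourier transform to read off the dominant frequency, the kind of spectral peak that fingerprinting services in the spirit of Shazam build on. The sample rate, tone, and delay values are all arbitrary assumptions.

```python
import numpy as np

# Digital audio is just a sequence of numbers sampled in time;
# effects and analysis alike are arithmetic on that sequence.

SAMPLE_RATE = 44100                     # samples per second (CD quality)
DURATION = 1.0                          # seconds of audio
FREQ = 440.0                            # a 440 Hz test tone (concert A)

# 1) Synthesize a pure tone: x[n] = sin(2*pi*f*n / fs)
t = np.arange(int(SAMPLE_RATE * DURATION)) / SAMPLE_RATE
tone = np.sin(2 * np.pi * FREQ * t)

# 2) A one-tap echo, the digital analogue of a single sound reflection:
#    y[n] = x[n] + a * x[n - d], where d is the delay in samples and
#    a < 1 models the energy the reflected wave has lost.
delay_samples = int(0.25 * SAMPLE_RATE)
attenuation = 0.5
echo = tone.copy()
echo[delay_samples:] += attenuation * tone[:-delay_samples]

# 3) Spectral analysis: the FFT shows which frequencies are present.
#    Picking peaks in spectra like this one is the starting point for
#    audio fingerprinting.
spectrum = np.abs(np.fft.rfft(echo))
freqs = np.fft.rfftfreq(len(echo), d=1.0 / SAMPLE_RATE)
print(f"dominant frequency: {freqs[np.argmax(spectrum)]:.1f} Hz")  # ~440.0 Hz
```

Real plug-ins and fingerprinting systems are far more sophisticated, but they are built on exactly this kind of arithmetic over sampled sound.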
Did composers and pianists invent these tools? No. STEM researchers passionate about sound and music did. It is this intersection of disciplines that excites me most.
In fact, all the tools that have turned musical paradigms on their head were invented by innovators in the physical sciences and engineering.
The electric guitar was invented by Adolph Rickenbacker (NOT Les Paul), an electrical engineer. Think of how the electric guitar made music by the Beatles and Jimi Hendrix possible in the first place. Now step back and think beyond music itself: think of the social movements that musicians like John Lennon were able to connect with through this instrument.
The first phonograph, which would pave the way for the gramophone, was invented by Thomas Edison, an engineer among other things.
Electret microphones were invented at Bell Laboratories, a research institution staffed by some of the brightest physicists and electrical engineers (who also invented the transistor and the laser, and are responsible for eight Nobel Prizes).
Doctors didn’t invent the MRI, PET, or CT scan. Physicists and engineers invented these essential tools, which have revolutionized the medical profession and, as we’ve seen, the industry of sound. Since I’m also headed to graduate school, it’s worth noting the burgeoning academic fields at this intersection:
- Architectural Acoustics
- Instrument Acoustics
- Algorithmic Composition
- Speaker Physics
- Microphone Physics/Engineering
- Signal Processing
- Musical Human-Computer Interaction
Universities like Stanford, Georgia Tech, Carnegie Mellon, Dartmouth, McGill, and many more now offer master’s and PhD programs in this field. It’s called different things at different institutions, but it generally falls under “Music Technology,” “Acoustics,” “Music Engineering,” or “Digital Musics.”
This recent presentation at the Grove in New Haven, CT discusses this idea of STEMusic in the context of my research. In it, I walk through my senior thesis in Applied Physics on surrogate soundboards, my musical HCI research using the Leap Motion, new musical interfaces, and a modular seung (a northern Thai guitar) I am building in Mechanical Engineering.
Hopefully this paints a picture of STEM and music as a large and exciting field of study. And in case it’s still lingering somewhere in your mind: I am not studying to become an audio engineer! That’s a completely different field! I am not learning how to use studio equipment, record the next Ariana Grande, and go multi-platinum. Although in an alternate life that would be great, and I would make lots of money…
I want to invent the new audio and sonic tools that people from all walks of life will use and enjoy, be they musicians, composers, media broadcasters, app users, or everyday listeners.