
Eran Egozy and Anna Huang in the Thomas Tull Concert Hall.
PHOTO: SIMON SIMARD
Hint: The class is 21M.569 Algorithms and Interactions in Human-AI Partnerships, which debuted in fall 2024 and will be offered in the new, multidisciplinary MIT Music Technology and Computation Graduate Program. Its instructor is Anna Huang SM ’08, the Robert N. Noyce Career Development Professor of Music & Theater Arts and Electrical Engineering and Computer Science. Huang is also a composer who created the machine-learning model Coconet that powered the Bach Doodle, Google’s first AI-powered doodle (an interactive variation of the Google logo on its homepage), in 2019.
If you guessed that the final project in question explored ways to use the human body as an instrument in both physical and virtual machine-learned spaces, you’d be right. The possibilities for unique research and composition pathways like that, says Huang, are part of what makes the Graduate Program in Music Technology and Computation so exciting—and why it’s nearly impossible to anticipate the dynamic work that students will produce.
A rare, dedicated music technology program
An interdisciplinary collaboration between Music & Theater Arts and the School of Engineering, the program offers two tracks toward a master’s degree: a fifth-year master’s for MIT undergraduates, whose first cohort begins in fall 2025, and a track open to all applicants starting in fall 2026. “There’s been a long tradition of musicians working with computers, figuring out how to digitally sample and mathematically create new sounds,” says Huang, who holds a shared faculty position between the MIT Music & Theater Arts Section and the MIT Stephen A. Schwarzman College of Computing. But music technology extends beyond using computers to synthesize sounds.
“We also teach machines to listen, so that they can jam with other musicians and respond to what the musicians they’re working with are playing,” she says, noting her time as a student at the MIT Media Lab, where she worked with Professor Emeritus Barry Vercoe, a pioneer in digital audio processing, in his Music, Mind, and Machine Group. Eighteen years later, Huang has started a new research lab at MIT called Human-AI Resonance (HAI-Res, pronounced “high res”). “My current research focus is interaction-driven generative AI. I’m interested in thinking about creativity not through imitation, but as a collaborative process where creative ideas emerge through human-AI interaction,” she says. As a composer, she writes both acoustic and electronic music and plays multiple instruments, including the guzheng, a table harp also known as a Chinese zither.
Not many universities have a dedicated music technology program; most instead employ just one or two faculty members with expertise in the area. “Now students will be coming to MIT because music tech is their center of focus,” she says. “Having that support from MIT is exciting—there are very few universities with this kind of space for music technology and computation.”
Eran Egozy ’93, SM ’95, Professor of the Practice, looks forward to building the music technology faculty with more multidisciplinary experts like Huang and attracting applicants with unique interests. “We’re getting inquiries from potential applicants who are musicians or composers and also have a strong technical sense and computational knowledge,” he says. “It will be super cool to see what those people can do when they’re on campus, and how it changes the musical and research landscape here.”
A musical foundation
“Even before this program started, undergraduates who took my music technology class would oftentimes tell me it’s been their favorite class at MIT so far, and I think that’s because it employs both sides of their personalities,” Egozy says. He had a similar experience as an undergraduate at MIT. While music technology courses were not regularly offered at that time, his experience with musical instruction and music technology at the MIT Media Lab paved the way for him to cofound Harmonix Music Systems, which created the groundbreaking technology behind the video game franchises Guitar Hero and Rock Band.
He stresses that the strength of the existing music courses and faculty members is what makes it possible to build a high-caliber program, and that the resources in the new Edward and Joyce Linde Music Building create a strong base for it. Huang appreciates the sense of community in the music building as well. “It’s great to be back on campus again, and the social experience with students and colleagues in the new building, even just being able to walk down the hallway and see who’s there and what they’re working on, is a big part of that,” she says.
The music technology labs and acoustically advanced performance spaces in the new building, Egozy says, in addition to MIT Schwarzman College of Computing and School of Engineering facilities, will be critical to the new program. “The foundation for this new graduate program is music, and adding technology creates exciting new areas for research,” Egozy adds. “That’s why MIT is the right place. It has both ingredients.”