MIT Better World

Eran Egozy ’95, MNG ’95, Professor of the Practice in Music Technology. Portrait: Jon Sachs

By Stephanie M. McPherson SM ’11

The Course

21M.385 / 6.809: Interactive Music Systems

Instructor

Eran Egozy ’95, MNG ’95
Professor of the Practice in Music Technology

From the Catalog

Interactive Music Systems is a hands-on programming and design course that explores audio synthesis, musical structure, human-computer interaction, and visual presentation as the ingredients for the creation of engaging, real-time interactive musical experiences. These experiences allow users to connect with music more deeply than through passive listening. The most successful ones give users intuitive control, greater musical insight, and a deeper emotional response to the musical experience. Students learn about the principles, design considerations, and aesthetic qualities of interactive music systems by exploring topics such as music perception and audio synthesis, analysis and application of design elements in music games, music visualization, and aesthetic cohesion.

Successful commercial examples include games like Guitar Hero, Rock Band, and Fantasia: Music Evolved.

Venkatesh Sivaraman ’20: “You can design an instrument that requires a lot of effort but is really satisfying to play. And so the goal of our class and the projects we do is to lower that bar. But there’s also a sweet spot. It can’t be too easy.”

The Lectures

Interactive Music Systems has been offered every semester since spring 2015. The course (developed by Professor of the Practice Eran Egozy with support from Leslie Kaelbling, the Panasonic Professor of Computer Science and Engineering) combines computation with music, allowing students to apply what they’ve learned in their coding classes in a new and creative way. Egozy’s career creating wildly popular interactive music games gives students unique and thorough insight into what it takes to develop new musical experiences. Prerequisites include 21M.301 (Harmony and Counterpoint I) and 6.009 (Fundamentals of Programming), so all of the students have some degree of interest in the unique combination of music and technology. A student who took the class in the previous term serves as its teaching assistant.

Egozy: “There are lots of issues around music that lend themselves to notions of computation. The fact that there are structures and rules means that you can think about it computationally and you could write code or programs that do something with that music.”
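The point that musical structure can be handled computationally is easy to see even in a toy sketch. The Python snippet below is an illustration only, not material from the course (whose projects build far richer real-time systems); all of the names in it are hypothetical. It treats pitches as MIDI note numbers, so deriving a major scale, transposing a melody, and converting a note to a frequency all reduce to simple arithmetic.

# Toy illustration: notes as numbers (MIDI pitches), musical rules as arithmetic.

MAJOR_SCALE_STEPS = [2, 2, 1, 2, 2, 2, 1]  # whole/half steps of a major scale

def major_scale(root_midi):
    """Return the eight MIDI pitches of a major scale starting at root_midi."""
    notes = [root_midi]
    for step in MAJOR_SCALE_STEPS:
        notes.append(notes[-1] + step)
    return notes

def transpose(notes, semitones):
    """Shift every pitch in a melody by the same number of semitones."""
    return [n + semitones for n in notes]

def midi_to_hz(n):
    """Convert a MIDI note number to frequency (A4 = MIDI 69 = 440 Hz)."""
    return 440.0 * 2 ** ((n - 69) / 12)

if __name__ == "__main__":
    c_major = major_scale(60)        # C4 major scale
    print(c_major)                   # [60, 62, 64, 65, 67, 69, 71, 72]
    print(transpose(c_major, 7))     # the same scale, a perfect fifth higher
    print(round(midi_to_hz(69), 1))  # 440.0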

Isabel Kaspriskie ’19: “Professor Egozy is both extremely supportive and encouraging, and he definitely imparts his years of experience to the class. The class has a great reputation, and the creative atmosphere there exceeded all of my expectations.”

Every Monday, students meet for lectures on a variety of topics in music technology. Through these lectures and occasional analysis of popular musical games, students come to understand the many components of creating exciting interactive music systems. Occasional guest lectures from industry innovators expose the students to the breadth of opportunities available in the field.

Egozy is the cofounder of Harmonix Music Systems, which developed the video game franchises Guitar Hero and Rock Band. In 2008, Egozy and his cofounder, fellow MIT alumnus Alex Rigopulos ’92, SM ’94, were named to Time’s list of the 100 most influential people for their work on Rock Band.

Kaspriskie: “In my experience, the best scientists and engineers use creative processes in their technical work. So it makes sense to me that MIT should foster creative thinking both in STEM as well as the humanities.”

The Projects

Wednesdays are reserved for class assignments: problem set demos where students show off the creative part of their assignment, an “exploration presentation” where students explain interesting interactive music systems already in existence, and a lab where students work out problems related to that week’s lecture material. For the final project, students team up to create their own interactive music experience. Over the years, these have included an interactive guitar tutor and a side-scrolling video game that requires players to perform actions on the beat.

Interactive Music Systems is just one of several music technology classes enhancing MIT’s already rich array of music offerings. As a result, students who come to MIT for an unparalleled technical education can enjoy world-class training in the arts as well.

Egozy: “The course and assignments challenge students in two ways: first, they have to write the software to create a piece of music technology, and second, they have to use the system they just made in an interesting, creative way.”

Kaspriskie: “My team is working on a visualizer [a feature that generates imagery for music] for Spotify.”

Sivaraman: “We’re planning to build an interactive music generation system using the Kinect [motion sensor], so that you can place objects in a virtual space and move them around to produce different kinds of music.”

Emily Hu ’20 (spring 2019 teaching assistant): “Music has always been pretty important to me because it’s always been a way for me to make friends, connect with people. It’s very universal. I really enjoy bringing that experience to other people as well—which is why this interactivity with music through technology is so interesting.”