Selected Projects

Music Technology for Second-Language Learning

It has long been recognized that second-language learners can benefit from listening to music in their new language and singing along. Technology, however, can extend these benefits further. For instance, a piece of music is only useful for second-language learning if it is intelligible enough that a non-native speaker can understand the lyrics. I developed a novel test for human subjects to assess the intelligibility of musical excerpts, identified acoustic features that correlated with the measured intelligibility, and used those features to create an algorithm for predicting musical intelligibility. I also contributed to a system that automatically assesses the pronunciation of sung lyrics so that students can verify they are pronouncing the words correctly.
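
As a rough illustration of the prediction step, the sketch below fits a simple least-squares linear model mapping acoustic features to listener intelligibility scores. The feature names and numbers here are hypothetical stand-ins, not the features or data from the actual study.

```python
import numpy as np

# Hypothetical precomputed features for each excerpt:
# [vocal-to-accompaniment ratio (dB), tempo (BPM), syllable rate (syl/s)].
# Illustrative stand-ins, not the features identified in the actual study.
features = np.array([
    [ 6.0,  90.0, 2.1],
    [-2.0, 140.0, 4.5],
    [ 3.5, 110.0, 3.0],
    [ 0.5, 125.0, 3.8],
])
# Mean intelligibility score for each excerpt (e.g. the fraction of
# words listeners transcribed correctly).
ratings = np.array([0.85, 0.40, 0.70, 0.55])

# Fit a least-squares linear model with an intercept term.
X = np.hstack([features, np.ones((len(features), 1))])
weights, *_ = np.linalg.lstsq(X, ratings, rcond=None)

def predict_intelligibility(feat):
    """Predict an intelligibility score for a new excerpt's features."""
    return float(np.append(feat, 1.0) @ weights)

print(predict_intelligibility([2.0, 120.0, 3.2]))
```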

Music Recommendation for Motivating Exercise

Studies have shown that many people do not exercise as much as their health requires, and while music has been shown to motivate people to exercise, computers have difficulty determining which music will best motivate any given individual. I therefore worked on the development of a music recommendation engine geared specifically towards providing people with music that would motivate them to exercise. The engine used machine learning to quickly learn the musical features of songs that a specific user found motivational, then used those features to recommend appropriate music.
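
As a rough sketch of this preference-learning step (the actual system's features and model are described in the associated publications), the example below trains a scikit-learn logistic-regression model on a user's ratings of tracks, described by hypothetical features, and then ranks unheard candidates by predicted probability of being motivational.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-track features: [tempo (BPM), energy (0-1), beat strength (0-1)].
rated_tracks = np.array([
    [170, 0.9, 0.8],
    [ 80, 0.3, 0.4],
    [150, 0.8, 0.9],
    [ 95, 0.4, 0.3],
    [160, 0.7, 0.7],
    [ 70, 0.2, 0.5],
])
# 1 = the user found the track motivational during exercise, 0 = did not.
labels = np.array([1, 0, 1, 0, 1, 0])

# Learn this user's preferences from their feedback.
model = LogisticRegression().fit(rated_tracks, labels)

# Rank unheard candidate tracks by predicted probability of being motivational.
candidates = np.array([
    [165, 0.85, 0.9],
    [ 90, 0.5,  0.4],
    [140, 0.6,  0.8],
])
scores = model.predict_proba(candidates)[:, 1]
ranking = np.argsort(scores)[::-1]
print("recommend order:", ranking, "scores:", np.round(scores, 2))
```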

Noise-Robust Beat Tracking for Musical Robots

For automated devices such as robots to perform alongside humans or otherwise participate in musical activities, they must be able to detect key features of the music, such as beat locations. To accurately guide the robot's movements, these beat-detection algorithms must be robust both to the ego noise produced by the robot and to other noise sources in the environment. I therefore developed a noise-robust beat tracking algorithm that uses machine learning to analyze frames of music and identify recurring elements likely to correspond to beats. In testing, this algorithm proved more accurate than several comparison algorithms on audio distorted both by robot motor noise and by noise recorded in a bar/concert venue.

A Hubo humanoid robot using beat tracking to respond to music while adding acoustic ‘clicks’ to indicate precisely where it believes the beats are located
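
As a simplified illustration of frame-based beat analysis, the sketch below computes a spectral-flux onset-strength curve and estimates the beat period from its autocorrelation; subtracting a local median from the flux is one simple way to suppress steady broadband noise. The actual algorithm's machine-learning and noise-robustness machinery were more involved.

```python
import numpy as np

def onset_strength(audio, sr, n_fft=1024, hop=441):
    """Spectral flux: summed positive spectral change between frames (20 ms hop)."""
    window = np.hanning(n_fft)
    n_frames = 1 + (len(audio) - n_fft) // hop
    frames = np.stack([audio[i * hop:i * hop + n_fft] * window
                       for i in range(n_frames)])
    mags = np.abs(np.fft.rfft(frames, axis=1))
    flux = np.maximum(np.diff(mags, axis=0), 0.0).sum(axis=1)
    # Subtracting a local median is one simple way to suppress the steady
    # broadband component contributed by motor or crowd noise.
    k = 16
    med = np.array([np.median(flux[max(0, i - k):i + k + 1])
                    for i in range(len(flux))])
    return np.maximum(flux - med, 0.0)

def estimate_beat_period(flux, sr, hop=441):
    """Pick the autocorrelation peak within a plausible tempo range."""
    ac = np.correlate(flux, flux, mode="full")[len(flux) - 1:]
    lo = int(sr / hop * 60 / 200)   # lag floor: 200 BPM upper bound
    hi = int(sr / hop * 60 / 60)    # lag ceiling: 60 BPM lower bound
    lag = lo + int(np.argmax(ac[lo:hi]))
    return 60.0 * sr / (hop * lag)  # tempo in BPM

# Example: a synthetic click track (one click every 0.5 s) buried in noise;
# the estimator should recover roughly 120 BPM.
sr = 22050
audio = 0.1 * np.random.randn(10 * sr)
audio[np.arange(0, 10 * sr, sr // 2)] += 5.0
print(round(estimate_beat_period(onset_strength(audio, sr), sr)), "BPM")
```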

Emotional Analysis for Robot Dance

Humanoid robots provide an excellent platform to explore creativity and dance in a controllable, repeatable manner. However, as dance is an inherently emotional activity, dancing robots must be able to convey emotions which comport with the mood of the music. I worked on a variety of projects related to this task, including the development of emotion recognition algorithms that a robot could use to evaluate the mood of a given piece of music in order to determine which dance gestures would be appropriate, and a study of multiple humanoid robot platforms that evaluated which kinds of emotions each robot was best able to convey through dance.

Several Hubo humanoid robots dancing in response to music
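​
As a minimal sketch of how a mood estimate can drive gesture selection, the example below maps a valence/arousal prediction (the standard two-axis model in music emotion recognition) to a mood quadrant and an associated gesture family; the thresholds and gesture names are hypothetical placeholders, not the ones used in these projects.

```python
# Minimal sketch: map a music-emotion estimate to a family of dance gestures.
# Gesture names and thresholds are hypothetical placeholders.

GESTURES = {
    "happy":   ["bounce", "arm_wave"],      # high valence, high arousal
    "angry":   ["stomp", "sharp_punch"],    # low valence,  high arousal
    "sad":     ["slow_sway", "head_drop"],  # low valence,  low arousal
    "relaxed": ["gentle_rock", "glide"],    # high valence, low arousal
}

def mood_quadrant(valence: float, arousal: float) -> str:
    """Classify a (valence, arousal) estimate into a mood quadrant."""
    if arousal >= 0.5:
        return "happy" if valence >= 0.5 else "angry"
    return "relaxed" if valence >= 0.5 else "sad"

def pick_gestures(valence: float, arousal: float) -> list[str]:
    return GESTURES[mood_quadrant(valence, arousal)]

# e.g. an upbeat major-key piece might score high on both axes:
print(pick_gestures(valence=0.8, arousal=0.7))  # ['bounce', 'arm_wave']
```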

LiveNote: Orchestral Performance Companion

I made key contributions to LiveNote, an app that follows orchestral performances in real time in order to stream contextual information to users' mobile devices. My primary role on the project was developing the score-tracking algorithm itself. The algorithm analyzes the audio with acoustic features to form a profile of the music over time, then dynamically warps that profile against annotated reference recordings to identify the measure the musicians are currently playing. The project was developed in conjunction with the Philadelphia Orchestra, which has used it in many subsequent performances, and it was a winner of the 2011 Knight Arts Challenge Philadelphia.

The Orchestral Performance Companion tracking a live performance of a Bach orchestral work
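
The core alignment idea can be illustrated with classic dynamic time warping: build a frame-by-frame feature profile of the live audio, compute its distance to the profile of an annotated reference recording, and recover the warping path. The sketch below does this offline with generic one-dimensional feature vectors; the real system tracked the score online using specific acoustic features.

```python
import numpy as np

def dtw_path(live, ref):
    """Align two feature sequences (frames x dims) with dynamic time warping."""
    n, m = len(live), len(ref)
    # Pairwise Euclidean distances between live and reference frames.
    dist = np.linalg.norm(live[:, None, :] - ref[None, :, :], axis=2)
    # Accumulated-cost matrix with the standard step pattern.
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            acc[i, j] = dist[i - 1, j - 1] + min(acc[i - 1, j],
                                                 acc[i, j - 1],
                                                 acc[i - 1, j - 1])
    # Backtrack to recover the warping path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([acc[i - 1, j - 1], acc[i - 1, j], acc[i, j - 1]])
        if step == 0: i, j = i - 1, j - 1
        elif step == 1: i -= 1
        else: j -= 1
    return path[::-1]

# If each reference frame is annotated with a measure number, the warping
# path tells us which measure each live frame belongs to.
ref_measures = [1, 1, 2, 2, 3, 3]                 # hypothetical annotation
ref = np.array([[0], [1], [4], [5], [2], [3]], float)
live = np.array([[0], [1], [1], [4], [5], [2], [3]], float)  # slower playing
for li, rj in dtw_path(live, ref):
    print(f"live frame {li} -> measure {ref_measures[rj]}")
```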

The Hubos Music Video

To demonstrate the musical performances made possible by robots, the MET-Lab enabled four Hubo humanoid robots to perform a cover of the Beatles song “Come Together” on specially designed pitched percussion instruments and a drum kit. My role in this project was to program the gestures the robots used to strike the instruments, including a tuning feature that let each robot fine-tune its own gestures until its pitch-detection algorithm determined it was striking the instruments properly and producing the correct notes.

The finished music video
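
As a toy illustration of that closed tuning loop, the sketch below pairs a simple autocorrelation pitch detector with a hypothetical strike-and-record interface (here simulated), nudging a strike-offset parameter until the detected pitch matches the target note. The interface and control law are illustrative assumptions, not the Hubo implementation.

```python
import numpy as np

def detect_pitch(audio, sr, fmin=100.0, fmax=1000.0):
    """Estimate fundamental frequency via autocorrelation peak picking."""
    audio = audio - audio.mean()
    ac = np.correlate(audio, audio, mode="full")[len(audio) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

def tune_strike(robot, target_hz, sr=44100, tol_hz=5.0, max_iters=20):
    """Adjust the strike offset until the struck bar rings at the target pitch.

    `robot.strike_and_record(offset_mm)` is a hypothetical interface: it
    performs one strike at the given lateral offset and returns the recorded
    audio. The real tuning procedure adjusted gesture parameters in an
    analogous closed loop.
    """
    offset_mm, step_mm = 0.0, 2.0
    for _ in range(max_iters):
        heard = detect_pitch(robot.strike_and_record(offset_mm), sr)
        if abs(heard - target_hz) <= tol_hz:
            return offset_mm                 # gesture produces the right note
        # Nudge the strike point in whichever direction raises or lowers pitch.
        offset_mm += step_mm if heard < target_hz else -step_mm
    raise RuntimeError("could not converge on the target pitch")

class SimulatedBar:
    """Toy stand-in for the robot: pitch drifts with the strike offset."""
    def __init__(self, sr=44100):
        self.sr = sr
    def strike_and_record(self, offset_mm):
        freq = 440.0 + 3.0 * offset_mm       # pretend the offset detunes the note
        t = np.arange(int(0.2 * self.sr)) / self.sr
        return np.sin(2 * np.pi * freq * t)

print(tune_strike(SimulatedBar(), target_hz=466.0))
```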

Summer for Music Technology

Every year, the MET-Lab hosts the Summer for Music Technology, a week-long intensive music technology boot camp for local high school students. I participated in the camp for several years as a lecturer, teaching topics ranging from musical interface design to constructing digital signals from analog inputs, and advised students as they completed individual projects using the concepts taught in the lectures.

Highlights from the 2013 SMT session