MySong: Automatic Accompaniment for Vocal Melodies
Dan Morris, Microsoft Research
dan@microsoft.com
Seminar on People, Computers, and Design
Stanford University, May 9, 2008

MySong is a system that automatically chooses chords to accompany a vocal melody. A user with no musical experience can create a song with instrumental accompaniment just by singing into a microphone, and can experiment with different styles and chord patterns using interactions designed to be intuitive to non-musicians. Our goal is to let people with no training in chords or harmonization get a taste of songwriting.
In this talk, I'll describe how MySong works, discuss results from a recent usability study, and present many audio examples demonstrating that non-musicians are in fact able to use this system as a powerful creative tool.
Dan Morris is a researcher in the Computational User Experiences (CUE) group at Microsoft Research, focusing on human-computer interaction, with a particular emphasis on creativity support and alternative input systems. He received his PhD from Stanford, where he worked primarily on haptics and physical simulation for virtual surgery. He has also worked on medical devices, particularly human neural prosthetic systems, as an undergraduate at Brown and later as a consulting engineer.
View this talk online at CS547 on Stanford OnLine or using this video link.