The objective of this project is to develop a robotic musician that will create meaningful and inspiring musical interactions with humans, leading to novel musical experiences and outcomes. The robot will combine computational modeling of music perception, interaction, and improvisation with the capacity to produce melodic and harmonic acoustic responses in physical and visual form. The PI's underlying hypothesis is that real-time collaboration between humans and computer-based players can capitalize on the combination of their unique strengths to produce new and compelling music. The project therefore aims to combine human creativity, emotion, and aesthetic judgment with the algorithmic computational capabilities of computers. The PI believes that a perceptual and improvisatory robot will best facilitate such interactions by bringing the computer into the physical world both acoustically and visually. Unlike computer- and speaker-based interactive music systems, a physical anthropomorphic robot will create familiar, acoustically rich, and visually engaging interactions with humans. To create intuitive as well as inspiring social collaboration with people, the robot will analyze live music using computational models of human perception and will generate algorithmic responses that are humanly impossible.

Building on and extending the PI's previous work on a perceptual robotic drummer, which focused primarily on rhythm, this project will develop a robot that can also listen to, analyze, and play melodic and harmonic music. The robot will use a four-arm mechanism to play the marimba (a melodic mallet instrument) and will infer musical meaning from live input based on a set of cognitive models of musical percepts such as melodic attraction, tension, and similarity. Based on this analysis, it will generate musical responses informed by mathematical constructs such as fractals, cellular automata, and genetic algorithms.
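To illustrate the kind of generative construct named above, the sketch below maps an elementary cellular automaton (Rule 90, whose evolution produces self-similar, fractal patterns) onto a marimba-range scale, so that each generation's active cells are read as pitches. All names, the seed, and the pitch mapping are hypothetical illustrations, not the project's actual implementation.

```python
RULE = 90  # elementary CA rule: new cell = left XOR right, yields fractal patterns

def step(cells, rule=RULE):
    """Advance one CA generation with wrap-around (circular) neighborhoods."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def ca_melody(seed, scale, generations):
    """Read each generation's active cells as scale degrees (one note cluster per step)."""
    melody, cells = [], list(seed)
    for _ in range(generations):
        melody.append([scale[i % len(scale)] for i, c in enumerate(cells) if c])
        cells = step(cells)
    return melody

# C major scale as MIDI note numbers (illustrative marimba range)
C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]
seed = [0, 0, 0, 1, 0, 0, 0, 0]
print(ca_melody(seed, C_MAJOR, 4))
# → [[65], [64, 67], [62, 69], [60, 64, 67, 71]]
```

The single seeded cell spreads outward symmetrically, so the generated clusters grow and thin in the characteristic Sierpinski pattern of Rule 90; a genetic algorithm or fractal subdivision could substitute for `step` in the same pipeline.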
The interaction schemes to be developed for the robot will include synchronous and sequential operations and will address aspects such as beat tracking and style adaptation. Project outcomes will include fundamental contributions to music perception and cognition, human-robot interaction, computer-assisted collaboration, and improvisation.
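As a minimal sketch of the beat-tracking aspect mentioned above, tempo can be estimated from note-onset times by finding the most common inter-onset interval. This is a simplified, hypothetical illustration (function names and the quantization resolution are assumptions); the abstract does not specify the project's actual beat-tracking model.

```python
from collections import Counter

def estimate_tempo(onsets, resolution=0.01):
    """Estimate tempo from onset times (seconds).

    Quantizes inter-onset intervals to `resolution` and returns the most
    common interval together with the corresponding beats per minute.
    """
    iois = [round((b - a) / resolution) * resolution
            for a, b in zip(onsets, onsets[1:])]
    interval, _ = Counter(iois).most_common(1)[0]
    return interval, 60.0 / interval

# Onsets roughly every 0.5 s (~120 BPM), with one slightly late note
onsets = [0.0, 0.5, 1.0, 1.5, 2.01, 2.5, 3.0]
interval, bpm = estimate_tempo(onsets)
print(interval, bpm)  # → 0.5 120.0
```

A real-time system would combine such an estimate with phase tracking to predict upcoming beats, letting the robot play in synchrony rather than merely react.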

Broader Impacts: Building on the engaging power of music, this project will help bring the exciting potential of HRI to the attention of the general public through workshops and high visibility concerts, which will be designed in particular to capture the interest and imagination of students who are not regularly drawn to music, mathematics, engineering, and the sciences. Within the HRI research community, project outcomes relating to sound perception, and to the integration of human expression, emotion, and aesthetics with robotic analytical and mechanical capabilities, will help researchers develop productive human-robot collaborations in a variety of non-musical domains. The PI expects the project will also play a pivotal role for the research center in music technology that he has initiated at Georgia Tech.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 0713269
Program Officer: Ephraim P. Glinert
Project Start:
Project End:
Budget Start: 2007-10-01
Budget End: 2011-09-30
Support Year:
Fiscal Year: 2007
Total Cost: $490,274
Indirect Cost:
Name: Georgia Tech Research Corporation
Department:
Type:
DUNS #:
City: Atlanta
State: GA
Country: United States
Zip Code: 30332