How do we recognize real-world objects from the auditory events that they produce? Objects can generate and reflect sound as well as light, and people can recognize objects on the basis of either kind of information. Yet we know almost nothing about auditory object recognition compared with the vast literature on visual object recognition. With funding from the National Science Foundation, Dr. Heller will examine how the human auditory system uses time-varying acoustic information to support the perception of auditory events. In an unprecedented series of experiments, Dr. Heller will bring rigorous experimental methods to bear on the study of real-world auditory events such as breaking glasses and bouncing balls. The experiments are designed to discover the relationship between event properties, such as sound-generating materials and their dynamics, and the resulting event percepts. The aim is to bridge the gap between the detailed knowledge that we have of low-level auditory processing and the more ecologically valid, but understudied, perception of events in the world. Results from these experiments will serve as the basis for further work on the complex interactions among event properties that give rise to larger-scale events. An additional benefit to the academic community will be the construction of an extensive environmental sound database that will be made freely available over the web.

Agency: National Science Foundation (NSF)
Institute: Division of Behavioral and Cognitive Sciences (BCS)
Application #: 0446955
Program Officer: Vincent R. Brown
Project Start:
Project End:
Budget Start: 2005-09-01
Budget End: 2009-10-31
Support Year:
Fiscal Year: 2004
Total Cost: $283,724
Indirect Cost:
Name: Brown University
Department:
Type:
DUNS #:
City: Providence
State: RI
Country: United States
Zip Code: 02912