How do we recognize real-world objects from the auditory events that they produce? Objects can generate and reflect sound as well as light, and people can recognize objects on the basis of either kind of information. Yet we know almost nothing about auditory object recognition compared with the vast literature on visual object recognition. With funding from the National Science Foundation, Dr. Heller will examine how the human auditory system uses time-varying acoustic information to support the perception of auditory events. In an unprecedented series of experiments, Dr. Heller will bring rigorous experimental methods to bear on the study of real-world auditory events such as breaking glass and bouncing balls. The experiments are designed to discover the relationship between event properties, such as sound-generating materials and their dynamics, and the resulting event percepts. The aim is to bridge the gap between the detailed knowledge that we have of low-level auditory processing and the more ecologically valid, but understudied, perception of events in the world. Results from these experiments will serve as the basis for further work on the complex interactions among event properties that give rise to larger-scale events. An additional benefit to the academic community will be the construction of an extensive environmental sound database that will be made freely available over the web.