Listeners integrate multisensory inputs (e.g., vision, vestibular) and multisystem inputs (e.g., motor control, feedback) with auditory processing to localize sound sources in the world around them. The proposed research aims to improve our understanding of this process in normal-hearing (NH) and hearing-impaired (HI) listeners. We will test moderately impaired listeners who wear hearing aids (HA) and profoundly impaired listeners bilaterally implanted with cochlear implants (CI), along with NH listeners for baseline measures. NH listeners use head movements to resolve auditory spatial ambiguities such as front-back confusions, in which listeners correctly estimate the angular position of a sound source but cannot reliably determine whether it is in front of or behind them. New data collected by Dr. Pastore (paper in press) show that many CI listeners can use rotational head movements to avoid front-back confusions, and initial pilot data with NH listeners suggest that movement may also improve localization acuity in reverberation. Together, these preliminary data suggest that HA and CI users may use head movements to simplify and parse their auditory environment more effectively than lab estimates obtained under stationary conditions would indicate. The knowledge gained from this research should prove valuable to designers of "smart" processors for cochlear implants and hearing aids. The Spatial Hearing Laboratory is equipped with a programmable rotating chair and an azimuthal ring of 24 loudspeakers, synchronized to present stimuli based on listeners' angular velocity and position; this apparatus offers control over several of the dynamic sensory and systems inputs involved in localizing sounds during listener rotation. In this facility, we will measure two indicators of the benefit of listeners' rotational movement.
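The geometry behind rotational resolution of front-back confusions can be illustrated compactly. The sketch below is not part of the proposal: it assumes a simplified sinusoidal head model with an illustrative head radius, and shows that a front source and its front-back mirror yield identical static interaural time differences (ITDs), while a small head rotation shifts their ITDs in opposite directions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s
HEAD_RADIUS = 0.0875    # m, an illustrative average head radius (assumption)

def itd(source_az_deg, head_az_deg):
    """Interaural time difference (s) under a simple sin() head model.

    Azimuths are in degrees, measured clockwise from straight ahead;
    positive ITD means the sound reaches the right ear first.
    """
    rel = math.radians(source_az_deg - head_az_deg)
    return (2.0 * HEAD_RADIUS / SPEED_OF_SOUND) * math.sin(rel)

# A source at 30 deg (front-right) and its mirror at 150 deg (back-right)
# produce identical static ITDs -- the front-back confusion:
front, back = 30.0, 150.0
print(abs(itd(front, 0.0) - itd(back, 0.0)) < 1e-12)  # True

# Rotating the head 5 deg to the right shifts the ITD in opposite
# directions for the two candidate locations, resolving the ambiguity:
d_front = itd(front, 5.0) - itd(front, 0.0)
d_back = itd(back, 5.0) - itd(back, 0.0)
print(d_front < 0.0, d_back > 0.0)  # True True
```

Only the sign of the ITD change relative to the known head rotation is needed, which is one reason even coarse self-motion information can disambiguate front from back.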
The first Aim will continue our investigation of CI and HA listeners' ability to use their own movement to resolve front-back confusions in sound source localization tasks, measuring how stimulus duration interacts with listener movement during rotation. This will offer insights into dynamic auditory spatio-temporal processing in both populations.
The second Aim will investigate NH, CI, and HA listeners' ability to "learn" the acoustic patterns of a modeled reverberant space and use that learning to simplify the perceived auditory scene while rotating versus stationary. We will measure these listeners' ability to group a single simulated direct sound together with a simulated reflection into one perceived auditory object located at or near the location of the first-presented stimulus, the so-called "buildup of the precedence effect."
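A lead-lag stimulus of the kind described above can be sketched in a few lines. The sample rate, lag, and gain below are illustrative assumptions, not the proposal's actual parameters; the sketch simply builds per-loudspeaker signals in which a "direct" click is followed by a delayed, attenuated copy standing in for a single reflection.

```python
FS = 48000  # Hz, an assumed sample rate

def precedence_pair(lag_ms=4.0, lag_gain=0.7, n_samples=1024):
    """Build per-loudspeaker signals for a lead-lag (precedence) trial.

    The 'lead' channel carries the simulated direct sound (a unit click);
    the 'lag' channel carries a delayed, attenuated copy simulating one
    reflection. At lags of a few milliseconds, listeners typically fuse
    the pair into a single auditory object near the lead loudspeaker.
    """
    lead = [0.0] * n_samples
    lead[0] = 1.0  # unit impulse as a simple click
    lag = [0.0] * n_samples
    k = round(lag_ms / 1000.0 * FS)  # lag delay converted to samples
    lag[k] = lag_gain
    return lead, lag

lead, lag = precedence_pair()
print(lag.index(max(lag)) * 1000.0 / FS)  # 4.0 (reflection delay in ms)
```

Varying the lag duration and gain, and comparing rotating versus stationary listening, is one way such a paradigm can probe when the reflection perceptually fuses with, or breaks apart from, the direct sound.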

Public Health Relevance

A better understanding of the strategies listeners use to disambiguate complex, challenging auditory environments could provide vital information for refocusing design approaches to the next generation of hearing prosthetics. This project will investigate the effectiveness of two such strategies in normal-hearing and hearing-impaired listeners: using listener motion to avoid front-back confusions, and using motion to avoid confusions between sound sources and their reflections in reverberant spaces. Understanding these processes could enable adaptive, "smart" algorithmic strategies for hearing aids and cochlear implants that improve their effectiveness in challenging, dynamic listening conditions.

National Institutes of Health (NIH)
National Institute on Deafness and Other Communication Disorders (NIDCD)
Postdoctoral Individual National Research Service Award (F32)
Study Section: Special Emphasis Panel (ZDC1)
Program Officer: Rivera-Rentas, Alberto L
Arizona State University-Tempe Campus
Other Health Professions
Sch Allied Health Professions
United States