In the everyday world, listeners and sound sources move, which presents a challenge for localizing sound sources. The auditory spatial cues (e.g., interaural differences) used to locate sound sources change with the relative positions of the sound sources and the listener. For instance, when a listener moves, the auditory spatial cues indicate that the sound source moved, even if it was stationary. Yet in the everyday world, a stationary sound source is not perceived as moving when a listener moves. How does the auditory system determine that a stationary sound source has not moved when listener movement caused the auditory spatial cues to change? This is the basic challenge this proposal addresses. A hypothesis, based on research in vision, is that sound source localization depends on the interaction of two sets of cues: one based on auditory spatial cues and another that indicates the position of the head (or body, or perhaps the eyes). Both sets of cues are required for the brain to determine the location of sound sources in the everyday world.
Two Specific Aims investigate this hypothesis:
Aim 1 involves experiments examining the role of auditory spatial cues and their interaction with head-position cues when listeners and/or sound sources move.
Aim 2 concerns the cues used to determine the position of the head (or body, or eyes) when listeners and/or sound sources move and listeners are asked to make sound source localization judgments. A unique facility is used in which listeners rotate while listening to sounds presented from multiple sources. New psychophysical procedures have been developed to study this under-researched topic: sound source localization when listeners and/or sound sources move. The hypothesis leads to the conclusion that sound source localization is a multisystem process; as a consequence, helping people localize sound sources cannot be based solely on understanding the processing of auditory spatial cues.

Public Health Relevance

Locating the position of a sound source based only on sound affords a listener several benefits: listeners can determine the location of objects they cannot see (e.g., objects behind them), navigate complex environments using sounds from various sources, maintain balance by localizing sound sources when visual cues are absent, process sounds of interest in reflective and reverberant spaces, and understand a target sound in a background of interfering sounds when the target and interfering sounds come from spatially separated sources. This proposal investigates sound source localization when listeners and sound sources move, as often occurs in the everyday world, in order to better understand sound source localization processing and ways to improve the benefits such processing might provide for people with sensory disorders.

Agency
National Institutes of Health (NIH)
Institute
National Institute on Deafness and Other Communication Disorders (NIDCD)
Type
Research Project (R01)
Project #
5R01DC015214-05
Application #
9956602
Study Section
Auditory System Study Section (AUD)
Program Officer
King, Kelly Anne
Project Start
2016-07-01
Project End
2021-06-30
Budget Start
2020-07-01
Budget End
2021-06-30
Support Year
5
Fiscal Year
2020
Total Cost
Indirect Cost
Name
Arizona State University-Tempe Campus
Department
Other Health Professions
Type
Sch Allied Health Professions
DUNS #
943360412
City
Tempe
State
AZ
Country
United States
Zip Code
85287
Pastore, M Torben; Natale, Sarah J; Yost, William A et al. (2018) Head Movements Allow Listeners Bilaterally Implanted With Cochlear Implants to Resolve Front-Back Confusions. Ear Hear 39:1224-1231
Yost, William A (2018) Auditory motion parallax. Proc Natl Acad Sci U S A 115:3998-4000
Yost, William A (2017) Sound source localization identification accuracy: Envelope dependencies. J Acoust Soc Am 142:173
Pastore, M Torben; Yost, William A (2017) Spatial Release from Masking with a Moving Target. Front Psychol 8:2238
Yost, William A (2017) Spatial release from masking based on binaural processing for up to six maskers. J Acoust Soc Am 141:2093
Yost, William A (2016) Sound source localization identification accuracy: Level and duration dependencies. J Acoust Soc Am 140:EL14
Loiselle, Louise H; Dorman, Michael F; Yost, William A et al. (2016) Using ILD or ITD Cues for Sound Source Localization and Speech Understanding in a Complex Listening Environment by Listeners With Bilateral and With Hearing-Preservation Cochlear Implants. J Speech Lang Hear Res 59:810-8
Dorman, Michael F; Loiselle, Louise H; Cook, Sarah J et al. (2016) Sound Source Localization by Normal-Hearing Listeners, Hearing-Impaired Listeners and Cochlear Implant Listeners. Audiol Neurootol 21:127-31