To follow different conversations dynamically in a crowded environment, we must constantly direct attention to the auditory signal of interest and segregate sounds originating from other sources. Normal-hearing listeners achieve this seamlessly, but hearing-impaired listeners, cochlear implant users, and individuals with (central) auditory processing disorders often find communicating in such everyday acoustic environments challenging. The long-term objective of this research is to characterize the cortical dynamics associated with different aspects of auditory attention and to incorporate these brain signals into a next-generation hearing assistive device that accounts for the listener's attentional focus. The current project is built on the hypothesis that distributed cortical regions coordinate top-down and bottom-up auditory attention, and that these regions are functionally coupled to the auditory sensory areas differently depending on the task at hand. Because the brain dynamics associated with auditory attention are not yet well understood, a necessary first step toward our long-term objective is to study the attentional network in normal-hearing listeners.
The specific aims of this project seek to identify differences between the cortical regions recruited for attention based on spatial versus non-spatial features (Aim 1), and to determine how the rest of the cortex compensates when the peripheral auditory signal is degraded, by simulating the reduced spectrotemporal acuity experienced by listeners with hearing impairment and by cochlear implant users (Aim 2). Furthermore, we propose to take a systems-level approach and investigate how other cortical regions communicate with the auditory sensory areas to coordinate the switching and maintenance of attention in a crowded acoustic scene (Aim 3). Our proposal emphasizes behavioral paradigms that bridge the gap between psychoacoustics and neuroimaging research, thereby addressing how different regions of the cortex act in concert to support communication in everyday settings.

Public Health Relevance

Listeners with hearing loss, as well as users of hearing aids and cochlear implants, often find it difficult to communicate in crowded acoustic environments. By understanding how the brain selectively attends to sounds of interest, we hope to inspire new signal processing techniques and aural rehabilitation strategies that ease the daunting task of everyday social interaction faced by this population.

Agency: National Institutes of Health (NIH)
Institute: National Institute on Deafness and Other Communication Disorders (NIDCD)
Type: Research Project (R01)
Project #: 1R01DC013260-01A1
Application #: 8644013
Study Section: Auditory System Study Section (AUD)
Program Officer: Donahue, Amy
Project Start: 2013-09-17
Project End: 2018-08-31
Budget Start: 2013-09-17
Budget End: 2014-08-31
Support Year: 1
Fiscal Year: 2013
Total Cost: $328,313
Indirect Cost: $115,813
Name: University of Washington
Department: Other Health Professions
Type: Schools of Arts and Sciences
DUNS #: 605799469
City: Seattle
State: WA
Country: United States
Zip Code: 98195
Maddox, Ross K; Lee, Adrian K C (2018) Auditory Brainstem Responses to Continuous Natural Speech in Human Listeners. eNeuro 5:
Atilgan, Huriye; Town, Stephen M; Wood, Katherine C et al. (2018) Integration of Visual Information in Auditory Cortex Promotes Auditory Scene Analysis through Multisensory Binding. Neuron 97:640-655.e4
McCloy, Daniel R; Lau, Bonnie K; Larson, Eric et al. (2017) Pupillometry shows the effort of auditory attention switching. J Acoust Soc Am 141:2440
Bizley, Jennifer K; Maddox, Ross K; Lee, Adrian K C (2016) Defining Auditory-Visual Objects: Behavioral Tests and Physiological Mechanisms. Trends Neurosci 39:74-85
McCloy, Daniel R; Larson, Eric D; Lau, Bonnie et al. (2016) Temporal alignment of pupillary response with stimulus events via deconvolution. J Acoust Soc Am 139:EL57-62
Maddox, Ross K; Atilgan, Huriye; Bizley, Jennifer K et al. (2015) Auditory selective attention is enhanced by a task-irrelevant temporally coherent visual stimulus in human listeners. eLife 4:
McCloy, Daniel R; Lee, Adrian K C (2015) Auditory attention strategy depends on target linguistic properties and spatial configuration. J Acoust Soc Am 138:97-114
Larson, Eric; Lee, Adrian K C (2014) Potential Use of MEG to Understand Abnormalities in Auditory Function in Clinical Populations. Front Hum Neurosci 8:151
Larson, Eric; Maddox, Ross K; Lee, Adrian K C (2014) Improving spatial localization in MEG inverse imaging by leveraging intersubject anatomical differences. Front Neurosci 8:330
Maddox, Ross K; Pospisil, Dean A; Stecker, G Christopher et al. (2014) Directing eye gaze enhances auditory spatial cue discrimination. Curr Biol 24:748-52