To follow different conversations in a crowded environment, we must continually direct attention to the auditory signal of interest and segregate sounds originating from other sources. Normal-hearing listeners achieve this task seamlessly, but hearing-impaired listeners, cochlear implant users, and individuals with (central) auditory processing disorders often find communicating in such everyday acoustic environments challenging. The long-term objective of this research is to characterize the cortical dynamics associated with different aspects of auditory attention and to incorporate these brain signals into a next-generation hearing assistive device that accounts for the listener's attentional focus. The current project is built on the hypothesis that distributed cortical regions coordinate top-down and bottom-up auditory attention, and that these regions are functionally coupled to the auditory sensory areas differently depending on the task at hand. Because the brain dynamics associated with auditory attention are currently not well understood, a necessary first step toward our long-term objective is to study the attentional network in normal-hearing listeners.
The specific aims of this project seek to identify differences between the cortical regions recruited for attention based on spatial versus non-spatial features (Aim 1), and to determine how the rest of the cortex compensates when the peripheral auditory signal is degraded, by simulating the reduction in spectrotemporal acuity experienced by listeners with hearing impairment and by cochlear implant users (Aim 2). Furthermore, we propose to take a systems-level approach and investigate how other cortical regions communicate with the auditory sensory areas to coordinate the switching and maintenance of attention in a crowded acoustic scene (Aim 3). Our proposal emphasizes behavioral paradigms that bridge the gap between psychoacoustics and neuroimaging research, thereby addressing how different regions of the cortex act in concert to support communication in everyday settings.
Listeners with hearing loss, as well as users of hearing aids and cochlear implants, often find it difficult to communicate in crowded acoustic environments. By understanding how the brain selectively attends to sounds of interest, we hope to inspire new signal processing techniques and aural rehabilitation strategies that ease the daunting task of everyday social interaction that this population often faces.
McCloy, Daniel R; Larson, Eric D; Lau, Bonnie et al. (2016) Temporal alignment of pupillary response with stimulus events via deconvolution. J Acoust Soc Am 139:EL57-62
Bizley, Jennifer K; Maddox, Ross K; Lee, Adrian K C (2016) Defining Auditory-Visual Objects: Behavioral Tests and Physiological Mechanisms. Trends Neurosci 39:74-85
McCloy, Daniel R; Lee, Adrian K C (2015) Auditory attention strategy depends on target linguistic properties and spatial configuration. J Acoust Soc Am 138:97-114
Maddox, Ross K; Atilgan, Huriye; Bizley, Jennifer K et al. (2015) Auditory selective attention is enhanced by a task-irrelevant temporally coherent visual stimulus in human listeners. Elife 4:
Larson, Eric; Maddox, Ross K; Lee, Adrian K C (2014) Improving spatial localization in MEG inverse imaging by leveraging intersubject anatomical differences. Front Neurosci 8:330
Maddox, Ross K; Pospisil, Dean A; Stecker, G Christopher et al. (2014) Directing eye gaze enhances auditory spatial cue discrimination. Curr Biol 24:748-52
Larson, Eric; Lee, Adrian K C (2014) Potential Use of MEG to Understand Abnormalities in Auditory Function in Clinical Populations. Front Hum Neurosci 8:151