The research proposed herein aims to develop and utilize a multimodal imaging approach to examine the brain mechanisms underlying auditory attention. The proposed approach is to conduct the same experiment using functional magnetic resonance imaging (fMRI) and magneto-/electroencephalography (M/EEG). This powerful combination pairs the fine spatial precision of fMRI with the high temporal resolution of M/EEG to map the spatiotemporal dynamics of the cortical networks associated with auditory attention and scene analysis. Currently, however, the acoustical noise associated with fMRI presents a technical hurdle for all auditory studies. Initial steps to characterize the acoustical noise associated with different fMRI pulse sequences are underway. During the mentored phase, the candidate will draw on his signal processing expertise to develop a noise masking protocol that will psychoacoustically control for the auditory environment during fMRI scanning while mitigating the technical challenges in MR image reconstruction associated with the proposed auditory-friendly fMRI pulse sequence. In later stages, two experiments will examine how the prefrontal cortex is differentially engaged when subjects are directed to attend to different cues of the auditory stimulus. We will determine what "biomarkers" can be extracted from the M/EEG signals that are associated with the listener's attentional states. The project fits the candidate's long-term career goal of establishing a high-quality independent research program that combines engineering and neuroscience approaches in a synergistic manner to characterize the "biomarkers" associated with auditory scene analysis. This work will facilitate the candidate's immediate goal of becoming an expert in multimodal imaging while bringing to the field his knowledge of speech and hearing sciences, particularly his quantitative psychophysics training.
The independent phase of the NIH Pathway to Independence Career Development award will be carried out at the University of Washington, Seattle, where the candidate will begin his career as assistant professor on January 1, 2011.
Current hearing aid users find it hard to understand speech in noisy environments. The long-term objective of this work is to capture the attentional state of the user and design a next-generation hearing aid that will dynamically follow and select the signal of interest to be amplified.
Wronkiewicz, Mark; Larson, Eric; Lee, Adrian K C (2015) Leveraging anatomical information to improve transfer learning in brain-computer interfaces. J Neural Eng 12:046027
Lee, Adrian K C; Larson, Eric; Maddox, Ross K et al. (2014) Using neuroimaging to understand the cortical mechanisms of auditory selective attention. Hear Res 307:111-20
Larson, Eric; Lee, Adrian K C (2014) Potential use of MEG to understand abnormalities in auditory function in clinical populations. Front Hum Neurosci 8:151
Bharadwaj, Hari M; Lee, Adrian K C; Shinn-Cunningham, Barbara G (2014) Measuring auditory selective attention using frequency tagging. Front Integr Neurosci 8:6
Larson, Eric; Maddox, Ross K; Lee, Adrian K C (2014) Improving spatial localization in MEG inverse imaging by leveraging intersubject anatomical differences. Front Neurosci 8:330
Larson, Eric; Lee, Adrian K C (2014) Switching auditory attention using spatial and non-spatial features recruits different cortical networks. Neuroimage 84:681-7
Maddox, Ross K; Pospisil, Dean A; Stecker, G Christopher et al. (2014) Directing eye gaze enhances auditory spatial cue discrimination. Curr Biol 24:748-52
Larson, Eric; Lee, Adrian K C (2013) The cortical dynamics underlying effective switching of auditory spatial attention. Neuroimage 64:365-70
Larson, Eric; Lee, Adrian K C (2013) Influence of preparation time and pitch separation in switching of auditory attention between streams. J Acoust Soc Am 134:EL165-71
Maddox, Ross K; Cheung, Willy; Lee, Adrian K C (2012) Selective attention in an overcrowded auditory scene: implications for auditory-based brain-computer interface design. J Acoust Soc Am 132:EL385-90
Showing the most recent 10 out of 13 publications