Everyday listening environments often contain noise and acoustic reflections off surrounding surfaces, together known as reverberation. While normal-hearing listeners can converse in such environments with little effort, listeners with hearing loss find them challenging. Despite the ubiquity of reverberation, little is known about the neural mechanisms for robust listening in reverberation, and even less is known about the combined effects of noise and reverberation on neural coding. The proposed studies will, for the first time, investigate how speech-like stimuli presented in realistic reverberation are encoded in the auditory midbrain of an unanesthetized rabbit model. The auditory midbrain is known to play a key role in the precedence effect, which enables accurate sound localization in reverberation.
Three specific aims will use stimuli of increasing realism to address fundamental questions. Because modulations of the amplitude envelope are essential for speech reception, Specific Aim 1 will focus on amplitude-modulated (AM) stimuli. We will test the hypothesis that a previously identified "reverberant advantage" in the temporal coding of AM sounds occurs because the neurons' sensitivity to AM is enhanced by their sensitivity to the fluctuations in interaural coherence that necessarily accompany AM in reverberation. If verified, this hypothesis would highlight the importance of preserving dynamic binaural cues in hearing aids.
Specific Aim 2 will directly test how natural speech is coded by midbrain auditory neurons in various amounts of reverberation. We will use modern techniques for stimulus reconstruction from neural population activity to quantitatively assess the robustness of neural coding in reverberation and test whether adaptation to the statistics of the reverberant environment contributes to robust coding.
Specific Aim 3 will extend the Aim 2 results to the common case in which reverberation is combined with background noise. Using an innovative technique to independently control the amount of reverberation in a speech target and a noise interferer, we will disentangle how the neural coding of the target is affected by each. We will determine how the degradation in the interferer's binaural cues caused by reverberation reduces the benefit of spatial separation between target and interferer for neural coding of the target. Together, these studies will increase fundamental understanding of the neural mechanisms for robust coding in reverberation. The findings will guide the development of new processing strategies for hearing aids and cochlear implants that would improve speech reception in everyday challenging environments.
Hearing-impaired listeners struggle to understand speech in everyday reverberant environments, especially when reverberation is combined with noise, yet little is known about the neural mechanisms that allow normal-hearing listeners to understand speech in reverberation. The proposed studies will, for the first time, characterize how reverberation, alone and combined with noise, affects the neural coding of speech in the auditory midbrain. The findings will be essential for improving the way sound is processed in reverberant environments for individuals with hearing loss.