The overall goal of this project is to understand how natural acoustic environments that include reverberation and multiple sound sources are represented in the auditory midbrain. Such complex environments are frequently encountered in everyday situations, and cause difficulties in speech reception for the hearing-impaired. We will use the powerful virtual-space stimulation technique that we developed in the preceding project period. This technique, which simulates natural acoustic environments with closed acoustic systems, achieves precise control over the individual acoustic cues used for sound localization (interaural differences in time and level, and spectrum). This technique will be used to study the neural mechanisms that allow listeners to better detect sound signals in noisy backgrounds when the signal and the noise are spatially separated (Aim 1). We will assess the roles of monaural and binaural factors in the ability of populations of inferior-colliculus neurons to detect the signal in noise, and identify which localization cues are most important for detection. We will also study neural correlates of the precedence effect, wherein listeners accurately localize sound sources in the presence of reflections from wall surfaces (Aim 2). We will test hypotheses about how populations of inferior-colliculus neurons represent sound location by comparing estimates of sound location derived from neural responses with corresponding patterns of human localization judgments in the presence of reflections. To better understand the anatomical basis of localization mechanisms, we will correlate the physiological properties of neurons important for spatial hearing with their locations relative to regions that show distinct cytochemical labels (Aim 3).

This work addresses general questions about the neural basis of spatial hearing: how auditory space is represented in neural populations, how the auditory system uses differences in location to separate and recognize auditory objects, how it copes with reflections and competing sounds, and the extent to which these neural mechanisms specifically depend on directional information. In combination with research on the neural coding of speech, this work may clarify why hearing-impaired and elderly listeners have greater difficulty than normal-hearing listeners in understanding speech in the presence of reverberation and competing sounds, and may help in developing new kinds of hearing aids that perform better in noisy and reverberant environments.
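To make the localization cues mentioned above concrete, the following is a minimal illustrative sketch, not the project's actual virtual-space implementation: it imposes an assumed interaural time difference (ITD) and interaural level difference (ILD) on a mono signal to produce a two-channel stimulus of the kind a closed acoustic system would deliver. All parameter values (300 µs ITD, 6 dB ILD, 44.1 kHz sampling rate) are hypothetical examples.

```python
# Illustrative sketch only: impose an ITD (sample delay) and ILD (gain
# difference) on a mono signal. Parameter values are hypothetical and do
# not reflect the project's virtual-space stimulation system.
import numpy as np

def binaural_cues(signal, fs, itd_us=300.0, ild_db=6.0):
    """Return (left, right) channels with the right ear lagged and attenuated.

    itd_us : interaural time difference in microseconds (right ear lags).
    ild_db : interaural level difference in dB (right ear quieter).
    """
    delay = int(round(itd_us * 1e-6 * fs))   # ITD expressed in whole samples
    gain = 10.0 ** (-ild_db / 20.0)          # ILD expressed as a linear gain
    left = np.concatenate([signal, np.zeros(delay)])
    right = gain * np.concatenate([np.zeros(delay), signal])
    return left, right

# Example: a 100 ms noise burst whose cues favor the left side.
fs = 44100
noise = np.random.randn(int(0.1 * fs))
left, right = binaural_cues(noise, fs)
```

In a real virtual-space system these cues would vary with frequency and be derived from measured head-related transfer functions; the point here is only that delay and gain can be manipulated independently per ear, which is what allows each localization cue to be controlled in isolation.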