The overall goal of this project is to understand how natural acoustic environments that include reverberation and multiple sound sources are represented in the auditory midbrain. Such complex environments are frequently encountered in everyday situations, and cause difficulties in speech reception for the hearing-impaired. We will use the powerful "virtual-space" stimulation technique that we developed in the preceding project period. This technique, which simulates natural acoustic environments with closed acoustic systems, achieves precise control over the individual acoustic cues used for sound localization (interaural differences in time and level, and spectrum).

This technique will be used to study neural mechanisms that allow listeners to better detect sound signals in noisy backgrounds when the signal and the noise are spatially separated (Aim 1). We will assess the roles of monaural and binaural factors in the ability of populations of inferior-colliculus neurons to detect the signal in noise, and identify which localization cues are most important for detection. We will also study neural correlates of the precedence effect, wherein listeners accurately localize sound sources in the presence of reflections from wall surfaces (Aim 2). We will test hypotheses as to how populations of inferior-colliculus neurons represent sound location by comparing estimates of sound location derived from neural responses with corresponding patterns of human localization judgements in the presence of reflections. To better understand the anatomical basis of localization mechanisms, we will correlate physiological properties of neurons important for spatial hearing with their location relative to regions that show distinct cytochemical labels (Aim 3).
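The stimulus manipulations described above can be illustrated with a minimal sketch. The code below is not the authors' virtual-space implementation; it simply shows, under assumed parameter values, how an interaural time difference (ITD) and interaural level difference (ILD) can be imposed on a mono signal to build a binaural stimulus, and how a delayed, attenuated copy can be appended to mimic a single wall reflection (the lead-lag configuration used to probe the precedence effect). The sample rate, cue magnitudes, and function names are all illustrative choices.

```python
import numpy as np

FS = 44100  # sample rate in Hz (assumed for illustration)

def binaural_from_cues(mono, itd_s, ild_db):
    """Impose an ITD and ILD on a mono signal.
    Positive itd_s / ild_db place the source toward the left ear:
    the right ear's copy is delayed and attenuated."""
    shift = int(round(abs(itd_s) * FS))
    prompt = np.concatenate([mono, np.zeros(shift)])   # near-ear signal
    delayed = np.concatenate([np.zeros(shift), mono])  # far-ear signal
    gain = 10 ** (-abs(ild_db) / 20)                   # level drop at the far ear
    if itd_s >= 0:
        left, right = prompt, gain * delayed
    else:
        left, right = gain * delayed, prompt
    return np.stack([left, right])

def add_reflection(stereo, lag_s, atten_db):
    """Append a delayed, attenuated copy of the stimulus to each
    channel, mimicking one reflection (the 'lag' of a lead-lag pair)."""
    lag = int(round(lag_s * FS))
    out = np.concatenate([stereo, np.zeros((2, lag))], axis=1)
    out[:, lag:] += 10 ** (-atten_db / 20) * stereo
    return out

# Example: 10 ms noise burst with a 400 us ITD and 6 dB ILD,
# followed by a reflection arriving 4 ms later, 6 dB down.
burst = np.random.randn(int(0.01 * FS))
direct = binaural_from_cues(burst, itd_s=400e-6, ild_db=6.0)
with_echo = add_reflection(direct, lag_s=0.004, atten_db=6.0)
```

In a real virtual-space system these cues would be derived from measured head-related transfer functions rather than imposed independently, but the independent-manipulation form shown here is what allows each cue's contribution to detection and localization to be isolated.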
This work addresses general questions about the neural basis of spatial hearing: how auditory space is represented in neural populations, how the auditory system uses differences in location to separate and recognize auditory objects, how it copes with reflections and competing sounds, and the extent to which these neural mechanisms specifically depend on directional information. In combination with research on the neural coding of speech, this work may clarify why hearing-impaired and elderly listeners have greater difficulties than normal-hearing listeners in understanding speech in the presence of reverberation and competing sounds, and may help develop new kinds of hearing aids that would perform better in noisy and reverberant environments.

Massachusetts Eye and Ear Infirmary, United States