Localizing a sound in 3-dimensional space requires the processing of azimuth, elevation, and distance. Although considerable attention has been focused on the neural processing of azimuthal and elevational cues, our knowledge of the neural processing of distance is almost non-existent. Psychophysical and modeling studies show that distance perception in reverberant environments is based on the ratio of reverberant to direct sound energy. This ratio changes with distance because the direct energy decreases by 6 dB per doubling of distance, whereas the reverberant energy remains approximately constant with distance. Another distance cue is the increase in interaural level difference with decreasing distance. Yet another potential cue is the systematic decrease in high-frequency energy with increasing distance. To investigate the neural processing of sound distance, we will use virtual sounds synthesized from ear-canal recordings of sounds emanating from different distances and azimuths, made in our state-of-the-art anechoic and reverberant chambers. In human listeners, such virtual sounds produce auditory images that are appropriately externalized and localized. We will use this virtual technology because, in comparison to making neural recordings in the real sound field, it allows the dissection and modification of auditory cues and also offers far greater efficiency and flexibility in sampling spatial locations. It also permits switching back and forth between the anechoic and reverberant environments while recording from a neuron. We will explore distance processing in both anechoic and reverberant environments because their acoustics are quite different and both environments, to one degree or another, are normally experienced by humans and other animals.
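The direct-to-reverberant distance cue described above can be sketched numerically. The following is a minimal illustration, not part of the proposed methods: it assumes the direct sound follows the inverse-square law (6 dB attenuation per doubling of distance) and that the diffuse reverberant level is constant; the reference distance, reference level, and the constant reverberant level of -10 dB are arbitrary values chosen for illustration.

```python
import math

def direct_level_db(distance_m, ref_distance_m=1.0, ref_level_db=0.0):
    # Inverse-square law: direct energy falls ~6 dB per doubling of distance.
    return ref_level_db - 20.0 * math.log10(distance_m / ref_distance_m)

# Assumed constant diffuse reverberant level (illustrative value).
REVERB_LEVEL_DB = -10.0

def direct_to_reverberant_db(distance_m):
    # The distance cue: D/R ratio falls as the source moves away.
    return direct_level_db(distance_m) - REVERB_LEVEL_DB

for d in (1.0, 2.0, 4.0, 8.0):
    print(f"{d:4.1f} m: direct {direct_level_db(d):6.2f} dB, "
          f"D/R {direct_to_reverberant_db(d):6.2f} dB")
```

Each doubling of distance lowers the direct level, and hence the D/R ratio, by about 6 dB, which is the monotonic relation that makes the ratio usable as a distance cue.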
Our aim is to determine the neural coding of sound source distance in anechoic and reverberant environments in neurons of the inferior colliculus of the unanesthetized rabbit. Specifically, we will examine neural interactions between distance and azimuth, distance and acoustic environment, and azimuth and acoustic environment. The results may be helpful in designing hearing aids, cochlear implants, and robotic devices that incorporate all 3 dimensions of spatial hearing.