An understanding of how the brain integrates separate acoustic cues for comprehensive auditory perception is essential in explaining how complex sounds, even as intricate as spoken language, are interpreted and understood. Clearly, identifying the acoustic cues that the brain extracts from complex sounds is a necessary step in understanding the aspects of auditory information that are neurally coded. Currently, the echolocating bat is the only mammal that provides a physiological model for complex-sound analysis of identified stimulus features at higher levels of the auditory pathway. The bat's auditory system is uniquely specialized for processing its own echolocation signals, and since the type of information borne in the sound components is known, this animal is ideal for the study of neural mechanisms underlying target-feature extraction. Bats employing frequency-modulated (FM) sounds for echolocation exhibit exquisite acuity for detecting differences in echo delay; this acoustic cue conveys information about target distance and structure. Results of recent behavioral studies further indicate that information from spectral cues is processed together with temporal information for perception of a target as a coherent image. Our neurophysiological experiments in the FM bat, Myotis lucifugus, reveal overlapping tonotopic and delay-sensitive zones in the auditory cortex. In this zone of overlap, neurons exhibit dual sensitivity: to single sounds of specific frequencies and to sound pairs separated by a specific echo delay. The goal of the proposed study is to examine neurophysiologically how cortical neurons encode different acoustic parameters, with a focus on integrative coding strategies for unified auditory perception. Extracellular unit recordings will be obtained from delay-sensitive neurons, mainly in the overlapping zone of the Myotis auditory cortex.
Specifically, this research proposes to: (1) characterize neurons that monitor target distance under sound presentations that simulate the changing rates of sound emission during echolocation (tracking neurons); (2) characterize neurons that are sensitive to the rate at which echo delay changes, and hence neurally compute target velocity; (3) determine the essential spectral components of an echo for neurons coding target distance and structure. The combined results will clarify our understanding of multidimensional processing by a class of highly selective neurons within a single cortical zone. This is fundamental for establishing what role shifts in spectra and timing play in the comprehension of complex sounds such as human speech.