What we see helps us understand what we hear. Knowing the location of an object seems effortless, whether we see it, hear it, or both. Yet each sensory system determines stimulus location differently. The retina provides the brain with a snapshot-like map of where objects lie with respect to where the eyes are looking: an eye-centered frame of reference. The perception of sound location, in contrast, relies on subtle differences in arrival time and loudness between the two ears, which allow the brain to determine where a sound lies with respect to the head and ears: a head-centered frame of reference.

This project investigates the neural mechanisms that underlie communication and coordination between the visual and auditory systems, focusing on the spatial information gleaned from the eyes and ears. When an object can be both seen and heard, the fact that the eyes can move poses a major problem for relating its visual and auditory components. An object making a sound might lie to the left of your nose, but if your eyes are gazing even farther to the left, the visual image of that object will land on the right side of the retina's map of space. To coordinate and integrate the information gathered by the eyes and ears, the brain must use information about the angle of gaze to translate visual and auditory information into a common frame of reference. How does the brain do this? Previous work has shown that the auditory pathway carries considerable information about the angle of gaze, but the specific features of this eye position influence are poorly understood and will be investigated in this project. Specifically, the project will address the role of this eye position signal by recording electrical activity from neurons in the auditory pathways of awake monkeys trained to perform behavioral tasks. The experiments will examine the relationship between the brain's eye position signal and the ability to look to the location of a sound, and will determine the computational features of the eye position signal and the nature of its interaction with neural responses to sounds. The results will be used to build a detailed model of how visual and auditory information are combined in the brain of an animal whose visual, auditory, and oculomotor capacities are similar to those of humans.

In tandem with these research objectives, this project will foster advanced computer training for graduate and undergraduate students with psychology backgrounds, bring undergraduate and graduate students into laboratory research, and disseminate scientific advances to the public through articles on these and other scientific findings written for newspapers and magazines that reach a broad non-scientific audience.
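
As a concrete illustration of the reference-frame translation described above, the sketch below works through a toy one-dimensional example. It assumes the simplest textbook case, in which the transform between eye-centered and head-centered coordinates is purely additive in the angle of gaze; the function names, angle conventions, and specific numbers are illustrative and are not taken from the proposal.

```python
# Toy 1-D example (assumed additive transform): how an eye position (gaze angle)
# signal can bring visual and auditory locations into a common frame of reference.
# Angles are in degrees; positive values are rightward of straight ahead or of fixation.

def visual_to_head_centered(retinal_angle, eye_position):
    """Visual input arrives eye-centered (relative to fixation); adding the
    angle of gaze converts it to head-centered coordinates."""
    return retinal_angle + eye_position

def auditory_to_eye_centered(head_centered_angle, eye_position):
    """Auditory input (from interaural timing and level cues) arrives
    head-centered; subtracting the angle of gaze converts it to
    eye-centered coordinates."""
    return head_centered_angle - eye_position

# Scenario from the text: a sounding object 10 deg to the left of the nose
# (head-centered -10) while the eyes gaze even farther left (here, -25 deg).
eye_position = -25.0
sound_head_centered = -10.0

# In eye-centered coordinates the object falls to the RIGHT of fixation
# (+15 deg), even though it lies to the LEFT of the nose.
visual_eye_centered = sound_head_centered - eye_position

# Either frame can serve as the common one, provided eye position is used:
assert visual_to_head_centered(visual_eye_centered, eye_position) == sound_head_centered
assert auditory_to_eye_centered(sound_head_centered, eye_position) == visual_eye_centered

print(f"Eye-centered location of the seen-and-heard object:  {visual_eye_centered:+.1f} deg")
print(f"Head-centered location of the seen-and-heard object: {sound_head_centered:+.1f} deg")
```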