The goal of this project is to extend our knowledge of the neural mechanisms that underlie the integration of information from different sensory modalities and their transformation into motor commands to generate gaze shifts. Specifically, we will seek to establish the relationship between the magnitude and timing of sensory responses, associated motor discharges, and resulting gaze shifts. The proposed experiments, to be carried out in a newly developed head-unrestrained monkey preparation, are a natural continuation of those carried out in the behaving, head-restrained cat during the previous funding period. The change from the cat to the monkey preparation was dictated by the questions that arose from our previous work, which cannot be adequately addressed in the cat. The experimental approach will be to record from single units in the intermediate layers of the superior colliculus (SCi) of monkeys (1) during the presentation of acoustic, visual, and bimodal stimuli, and (2) while they orient to the sources of those stimuli. The proposed work comprises four specific aims:
AIM 1 will test the hypothesis that facilitation of behavioral responses to bimodal stimuli in primates is mediated by integrative mechanisms governed by linear (or sub-additive), rather than super-additive, processes.
AIM 2 will test the alternative hypothesis that enhanced bimodal responses are observed in single SC units, but only under conditions of high behavioral significance to the subject.
AIM 3 will test the hypothesis that the SC operates in two distinct modes: a detection mode, characterized by large neuronal responses that increase the likelihood of orienting movements directed to the source of the stimuli, and an attention-driven mode, characterized by smaller, transient neuronal responses that allow fine behavioral control, e.g., stopping the response if changing conditions so require.
AIM 4 will test the motor error hypothesis (Sparks, 1986) in the head-unrestrained monkey.

This project is concerned with two of the most fundamental questions facing systems neuroscience: how information from different sensory modalities is integrated into a single, unified representation of the surrounding space, and how such information is used to generate and control movement, specifically gaze shifts (coordinated movements of the eyes and head). Furthering our understanding of how nature has implemented neural solutions to these basic questions will inform the design of better brain-computer interfaces, prostheses, smart robots, and automatic recognition systems, and will ultimately help neurology and otolaryngology devise electronic and pharmacological treatments for patients affected by disorders of sensory and sensorimotor integration.