The nervous system senses the world through a variety of sensory modalities that provide complementary as well as redundant information about the environment. Through a process known as multisensory integration, these sensory signals are merged and used to perform tasks such as locating objects in space or identifying the words uttered by a speaker. With funding from the National Science Foundation, Dr. Alexandre Pouget is conducting a strongly theory-driven research program combining modeling, human psychophysics, and non-human primate neurophysiological studies to investigate the neural basis of multisensory integration.

Two factors make multisensory integration difficult. First, the sensory modalities often arrive in different formats (e.g., the sound and image of the same object are not directly comparable). Second, the modalities are not equally reliable (e.g., it is typically much easier to tell which word a speaker is uttering from sound than from lip movements). Using the framework of Bayesian inference, the research plan is first to develop a neural theory of multisensory integration that solves both problems optimally, and then to record from multisensory areas of awake monkeys to test the validity of the approach. While intuition suggests that a neuron should respond to the same location in space regardless of modality, a preliminary model suggests otherwise: in the computational network, the visual and tactile receptive fields of a given neuron do not occupy exactly the same location. Experiments will record from cortical visuo-tactile neurons to test whether their receptive fields behave as the simulations predict. The temporal aspect of the theory will also be tested through psychophysics experiments in humans, in which subjects perform sequences of eye movements in the presence of artificially induced motor error. The theory of optimal integration predicts that subjects will correct for their errors in proportion to the reliability of the visual feedback.
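
The core computation alluded to above is the standard Bayesian (maximum-likelihood) cue-combination rule: under the common assumption of independent Gaussian noise on each modality, the optimal estimate weights each cue by its inverse variance, so less reliable cues contribute proportionally less. The sketch below is purely illustrative; the variable names and noise levels are hypothetical and are not taken from the project itself.

    import numpy as np

    def combine_cues(x_vis, sigma_vis, x_tac, sigma_tac):
        """Maximum-likelihood combination of two noisy position estimates
        under independent Gaussian noise: weight each cue by its inverse
        variance (its reliability)."""
        w_vis = 1.0 / sigma_vis**2
        w_tac = 1.0 / sigma_tac**2
        x_hat = (w_vis * x_vis + w_tac * x_tac) / (w_vis + w_tac)
        # The combined estimate is more reliable than either cue alone.
        sigma_hat = np.sqrt(1.0 / (w_vis + w_tac))
        return x_hat, sigma_hat

    # Hypothetical example: vision is more reliable (smaller sigma), so the
    # combined estimate falls closer to the visual cue.
    x_hat, sigma_hat = combine_cues(x_vis=10.0, sigma_vis=1.0,
                                    x_tac=14.0, sigma_tac=2.0)
    print(x_hat, sigma_hat)  # 10.8, ~0.894

The same reliability weighting underlies the eye-movement prediction in the abstract: the degree to which subjects correct an artificially induced motor error should scale with the reliability of the visual feedback signal.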

This research will involve postdoctoral researchers as well as undergraduate and graduate students, and the results will be presented at major multidisciplinary conferences. The results will also have implications well beyond multisensory integration: multisensory integration is a special case of the general problem of Bayesian inference, which is believed to lie at the heart of numerous cognitive processes such as object recognition, visual perception, motor control, and abstract reasoning.

Agency: National Science Foundation (NSF)
Institute: Division of Behavioral and Cognitive Sciences (BCS)
Application #: 0446730
Program Officer: Lynne Bernstein
Project Start:
Project End:
Budget Start: 2005-08-01
Budget End: 2009-07-31
Support Year:
Fiscal Year: 2004
Total Cost: $328,660
Indirect Cost:
Name: University of Rochester
Department:
Type:
DUNS #:
City: Rochester
State: NY
Country: United States
Zip Code: 14627