In real-world situations, objects and events comprise multiple stimulus attributes. They emit or reflect light, make sounds, and have characteristic mass and tactile properties. Although the nervous system initially processes each sensory modality independently, with specialized receptors transducing a restricted portion of the stimulus energy, this information is ultimately combined at higher stations of the nervous system to generate unified percepts. Even though this is a fundamental function of sensory systems, very little is understood about how information from different sensory systems is combined, and how this integration results in the perception of single objects and events.

One way to explore the processes of multi-sensory integration is to compare the responses of single neurons to the perceptions generated by combined sensory stimuli. By comparing neuronal responses to the physical stimuli with the resulting perceptions, one can begin to elucidate the underlying neuronal mechanisms of those perceptions. These hypotheses can be tested further by taking advantage of stimuli that generate faulty perceptions, i.e., illusions.

The goal of this research project is to explore how single neurons in the cerebral cortex combine visual and auditory information. Specifically, we wish to determine whether multi-sensory stimuli are processed in regions of the cerebral cortex traditionally considered 'unimodal', or whether multi-sensory integration does not occur until the traditionally defined 'multi-modal' areas. We will use combined auditory and visual stimuli that generate perceptual illusions in both the temporal and spatial domains to explore where neurons respond best to the physical characteristics of the stimuli (as the sensory receptors do) and where neurons respond best to the resulting perceptions that we all experience.
By tracking these responses throughout the cortical hierarchy, we will begin to understand how and where these integrative processes take place. These insights will provide a cornerstone for understanding complex perception and cognitive ability.

Agency
National Institutes of Health (NIH)
Institute
National Eye Institute (NEI)
Type
Research Project (R01)
Project #
5R01EY013458-09
Application #
7633239
Study Section
Special Emphasis Panel (ZRG1-IFCN-E (02))
Program Officer
Steinmetz, Michael A
Project Start
2001-02-01
Project End
2011-06-30
Budget Start
2009-07-01
Budget End
2011-06-30
Support Year
9
Fiscal Year
2009
Total Cost
$254,275
Indirect Cost
Name
University of California Davis
Department
Neurosciences
Type
Schools of Arts and Sciences
DUNS #
047120084
City
Davis
State
CA
Country
United States
Zip Code
95618
Padberg, Jeffrey; Recanzone, Gregg; Engle, James et al. (2010) Lesions in posterior parietal area 5 in monkeys result in rapid behavioral and cortical plasticity. J Neurosci 30:12918-35
Padberg, Jeffrey; Cerkevich, Christina; Engle, James et al. (2009) Thalamocortical connections of parietal somatosensory cortical fields in macaque monkeys are highly divergent and convergent. Cereb Cortex 19:2038-64
Recanzone, Gregg H (2009) Interactions of auditory and visual stimuli in space and time. Hear Res 258:89-99
Recanzone, Gregg H; Sutter, Mitchell L (2008) The biological basis of audition. Annu Rev Psychol 59:119-42
Krubitzer, Leah; Huffman, Kelly J; Disbrow, Elizabeth et al. (2004) Organization of area 3a in macaque monkeys: contributions to the cortical phenotype. J Comp Neurol 471:97-111
Woods, Timothy M; Recanzone, Gregg H (2004) Visually induced plasticity of auditory spatial perception in macaques. Curr Biol 14:1559-64
Recanzone, Gregg H; Beckerman, Nathan S (2004) Effects of intensity and location on sound location discrimination in macaque monkeys. Hear Res 198:116-24
Recanzone, Gregg H (2003) Auditory influences on visual temporal rate perception. J Neurophysiol 89:1078-93
Recanzone, G H (2001) Spatial processing in the primate auditory cortex. Audiol Neurootol 6:178-81