Integrating sensory information from a variety of sources to produce motor commands is fundamental to human behavior. Impairments in multisensory and sensorimotor integration affect numerous aspects of human health, ranging from social interaction and communication to movement in complex environments. For example, directing gaze to the location of a sound is a complex information-processing task requiring the conversion of auditory input signals into motor commands to move the eyes. This process is impaired in patients with hemispatial neglect. Here, we propose a joint computational and experimental approach to illuminate this problem. Specifically, we will investigate how information about sound location is encoded in the spike trains of neurons in the primate superior colliculus (SC), in comparison to structures that provide inputs to or receive output from the SC, such as auditory cortex, intraparietal cortex, and the paramedian reticular formation. We will explore the reference frame (Aim 1) and coding format (Aim 2) of the representation of stimulus location. Of particular interest is the relationship between visual and auditory representations, whose similarity we will evaluate quantitatively.
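As a purely illustrative aside, the reference-frame question in Aim 1 can be sketched in a few lines of code: a sound is initially localized relative to the head (via interaural cues), whereas a visual target is registered relative to the eyes, so directing a saccade to a sound requires combining the head-centered sound location with current eye position. The function name and the one-dimensional simplification below are assumptions made for illustration, not the analysis used in the project.

```python
def head_to_eye_centered(sound_azimuth_deg, eye_azimuth_deg):
    """Convert a head-centered sound azimuth (deg) into an eye-centered
    motor error by subtracting current horizontal eye position.
    Illustrative 1-D sketch only; real signals are 2-D and noisy."""
    return sound_azimuth_deg - eye_azimuth_deg

# A sound 20 deg to the right of the head, with the eyes already 5 deg
# to the right, requires a 15 deg rightward saccade.
print(head_to_eye_centered(20.0, 5.0))  # 15.0
```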
In Aim 3, we will consider how these signals are "read out" into commands to move the eyes. This theoretical and experimental aim will test algorithms that could produce accurate motor commands to both visual and auditory targets despite differences in how these signals are encoded at earlier stages.
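To make the coding-format and read-out questions (Aims 2 and 3) concrete, the sketch below contrasts two hypothetical encodings of the same target: a place code (a hill of activity on a map, as for visual targets in the SC) and a monotonic rate code (as has been reported for auditory targets), and shows that a simple readout of either format can recover the same saccade amplitude. The unit tuning, parameter values, and readout rules are illustrative assumptions, not the algorithms to be tested in the project.

```python
import numpy as np

# Hypothetical 1-D population of SC-like units tuned to saccade amplitude.
preferred_amplitudes = np.linspace(-40, 40, 81)  # deg, one unit per degree

def place_code_response(target_deg, sigma=8.0):
    """Place code: a Gaussian hill of activity centered on the target."""
    return np.exp(-0.5 * ((preferred_amplitudes - target_deg) / sigma) ** 2)

def read_out_place_code(rates):
    """Weighted-average readout of the hill's location on the map."""
    return np.sum(rates * preferred_amplitudes) / np.sum(rates)

def rate_code_response(target_deg, slope=0.5, baseline=20.0):
    """Rate code: a single firing rate that varies monotonically with location."""
    return baseline + slope * target_deg

def read_out_rate_code(rate, slope=0.5, baseline=20.0):
    """Invert the monotonic rate-to-location mapping."""
    return (rate - baseline) / slope

target = 12.0  # deg
print(read_out_place_code(place_code_response(target)))  # ~12.0
print(read_out_rate_code(rate_code_response(target)))    # 12.0
```

Both readouts recover the 12-degree target, which illustrates the question at stake in Aim 3: how a single downstream circuit could generate accurate eye movements from upstream signals encoded in different formats.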
These aims will enhance our understanding of neural processing from sensory input to motor output. The issues of multisensory and sensorimotor integration investigated here bear on a variety of neurological disorders, such as those arising from stroke and other brain lesions. A better understanding of the transformation from sensory input to motor response will aid in identifying the pathophysiological substrates of disorders in which sensorimotor integration is impaired.

Public Health Relevance

Problems involving multisensory and sensorimotor integration are implicated in a variety of neurological disorders including stroke, neglect, Parkinson's disease, and autism. A better understanding of how sensory inputs of different modalities are transformed to produce motor commands will help us identify the pathophysiological substrates of these diseases. Ultimately, new therapies in which information from one sensory modality is used to compensate for deficiencies in another sensory modality may emerge from this research.

Agency: National Institutes of Health (NIH)
Institute: National Institute of Neurological Disorders and Stroke (NINDS)
Type: Research Project (R01)
Project #: 2R01NS050942-05A2
Application #: 7750431
Study Section: Special Emphasis Panel (ZRG1-IFCN-A (04))
Program Officer: Gnadt, James W
Project Start: 2004-09-01
Project End: 2014-06-30
Budget Start: 2009-07-01
Budget End: 2010-06-30
Support Year: 5
Fiscal Year: 2009
Total Cost: $341,250
Indirect Cost:
Name: Duke University
Department: Other Basic Sciences
Type: Schools of Arts and Sciences
DUNS #: 044387793
City: Durham
State: NC
Country: United States
Zip Code: 27705
Caruso, Valeria C; Pages, Daniel S; Sommer, Marc A et al. (2016) Similar prevalence and magnitude of auditory-evoked and visually evoked activity in the frontal eye fields: implications for multisensory motor control. J Neurophysiol 115:3162-73
Lee, Jungah; Groh, Jennifer M (2014) Different stimuli, different spatial codes: a visual map and an auditory rate code for oculomotor space in the primate superior colliculus. PLoS One 9:e85017
Bulkin, David A; Groh, Jennifer M (2012) Distribution of visual and saccade related information in the monkey inferior colliculus. Front Neural Circuits 6:61
Gruters, Kurtis G; Groh, Jennifer M (2012) Sounds and beyond: multisensory and other non-auditory signals in the inferior colliculus. Front Neural Circuits 6:96
Lee, Jungah; Groh, Jennifer M (2012) Auditory signals evolve from hybrid- to eye-centered coordinates in the primate superior colliculus. J Neurophysiol 108:227-42
Bulkin, David A; Groh, Jennifer M (2012) Distribution of eye position information in the monkey inferior colliculus. J Neurophysiol 107:785-95
Wei, Qi; Sueda, Shinjiro; Pai, Dinesh K (2010) Physically-based modeling and simulation of extraocular muscles. Prog Biophys Mol Biol 103:273-83
Maier, Joost X; Groh, Jennifer M (2009) Multisensory guidance of orienting behavior. Hear Res 258:106-12
Mullette-Gillman, O'Dhaniel A; Cohen, Yale E; Groh, Jennifer M (2009) Motor-related signals in the intraparietal cortex encode locations in a hybrid, rather than eye-centered reference frame. Cereb Cortex 19:1761-75
Kopco, Norbert; Lin, I-Fan; Shinn-Cunningham, Barbara G et al. (2009) Reference frame of the ventriloquism aftereffect. J Neurosci 29:13809-14

Showing the most recent 10 out of 17 publications