What we see helps us interpret what we hear. Integrating visual and auditory information is a fundamental aspect of human behavior. Impairments in visual-auditory integration affect numerous aspects of human health, ranging from communication and language to attention and movement in complex environments. However, the neural basis of this ability is not well understood. Previous work in barn owls has focused on how visual information is used to calibrate the representation of auditory space, and has implicated the inferior colliculus (IC) in this process. However, little is known about this area in primates. Rhesus monkeys have visual and auditory systems that are quite similar to those of humans and serve as an excellent animal model for studying the neural mechanisms underlying visual-auditory integration. Visual-auditory integration in humans and monkeys differs from that in barn owls because primates can move their eyes whereas barn owls cannot. Thus, primate visual-auditory integration requires three signals: visual, auditory, and eye position signals. It has long been known that the primate IC carries auditory information, and we have recently shown that it also carries eye position information (Groh et al., Neuron, 2001). Here, we turn to visual information in this structure. We have recently discovered robust visual responses in the IC of awake monkeys. In this proposal, we seek to study the properties of these visual responses and how they relate to auditory and eye position information in this brain region. Specifically, we aim to (1) characterize the visual response properties (Aim 1), (2) determine the frame of reference of these visual responses (Aim 2), and (3) determine how IC neurons respond to combined visual and auditory stimuli, especially when a spatial disparity between these stimuli produces a ventriloquism effect/aftereffect (Aim 3). The mechanisms of visual influence over auditory processing investigated here bear on a variety of disorders involving both visual and auditory components, such as dyslexia and autism. These experiments will yield a greater understanding of subcortical mechanisms for visual-auditory integration within the auditory pathway and may ultimately lead to treatment regimens that help ameliorate the deficits experienced by patients with these disorders.
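The reference-frame question in Aim 2 can be stated in one line: a visual stimulus is initially encoded relative to the eye, a sound relative to the head, and the two descriptions differ by the current eye position. The sketch below is a hypothetical illustration of that relationship (azimuth in degrees), not an analysis method from the proposal; the function names and numbers are invented for clarity.

```python
# Illustrative sketch only: how eye position links eye-centered (retinal)
# visual coordinates to head-centered auditory coordinates along azimuth.
# All values are hypothetical examples.

def visual_to_head_centered(target_on_retina_deg: float, eye_position_deg: float) -> float:
    """Convert an eye-centered (retinal) location to head-centered azimuth."""
    return target_on_retina_deg + eye_position_deg

def auditory_to_eye_centered(sound_azimuth_deg: float, eye_position_deg: float) -> float:
    """Convert a head-centered sound azimuth to an eye-centered location."""
    return sound_azimuth_deg - eye_position_deg

if __name__ == "__main__":
    eye_position = 10.0      # eyes deviated 10 deg right of straight ahead
    visual_on_retina = 5.0   # flash lands 5 deg right of the fovea
    sound_azimuth = 15.0     # sound source 15 deg right of the head midline

    # In head-centered coordinates the two stimuli coincide (15 deg), so a
    # neuron can treat them as one audiovisual event only if eye position
    # is taken into account when combining the signals.
    print(visual_to_head_centered(visual_on_retina, eye_position))  # 15.0
    print(auditory_to_eye_centered(sound_azimuth, eye_position))    # 5.0
```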

Agency: National Institutes of Health (NIH)
Institute: National Eye Institute (NEI)
Type: Research Project (R01)
Project #: 7R01EY016478-02
Application #: 7285121
Study Section: Cognitive Neuroscience Study Section (COG)
Program Officer: Oberdorfer, Michael
Project Start: 2006-01-01
Project End: 2010-12-31
Budget Start: 2006-09-01
Budget End: 2006-12-31
Support Year: 2
Fiscal Year: 2006
Total Cost: $143,373
Indirect Cost:
Name: Duke University
Department: Psychology
Type: Schools of Arts and Sciences
DUNS #: 044387793
City: Durham
State: NC
Country: United States
Zip Code: 27705
Pages, Daniel S; Ross, Deborah A; Puñal, Vanessa M et al. (2016) Effects of Electrical Stimulation in the Inferior Colliculus on Frequency Discrimination by Rhesus Monkeys and Implications for the Auditory Midbrain Implant. J Neurosci 36:5071-83
Pages, Daniel S; Groh, Jennifer M (2013) Looking at the ventriloquist: visual outcome of eye movements calibrates sound localization. PLoS One 8:e72562
Bulkin, David A; Groh, Jennifer M (2012) Distribution of visual and saccade related information in the monkey inferior colliculus. Front Neural Circuits 6:61
Bulkin, David A; Groh, Jennifer M (2012) Distribution of eye position information in the monkey inferior colliculus. J Neurophysiol 107:785-95
Bulkin, David A; Groh, Jennifer M (2011) Systematic mapping of the monkey inferior colliculus reveals enhanced low frequency sound representation. J Neurophysiol 105:1785-97
Maier, Joost X; Groh, Jennifer M (2009) Multisensory guidance of orienting behavior. Hear Res 258:106-12
Mullette-Gillman, O'Dhaniel A; Cohen, Yale E; Groh, Jennifer M (2009) Motor-related signals in the intraparietal cortex encode locations in a hybrid, rather than eye-centered reference frame. Cereb Cortex 19:1761-75
Kopco, Norbert; Lin, I-Fan; Shinn-Cunningham, Barbara G et al. (2009) Reference frame of the ventriloquism aftereffect. J Neurosci 29:13809-14
Werner-Reiss, Uri; Groh, Jennifer M (2008) A rate code for sound azimuth in monkey auditory cortex: implications for human neuroimaging studies. J Neurosci 28:3747-58
Porter, Kristin Kelly; Metzger, Ryan R; Groh, Jennifer M (2007) Visual- and saccade-related signals in the primate inferior colliculus. Proc Natl Acad Sci U S A 104:17855-60