The pattern of image motion across the retina (optic flow) during self-motion provides a powerful cue to the observer's heading direction. However, optic flow processing poses a number of computationally difficult problems, most notably the segregation of object motion from self-motion. Vestibular signals about the linear components of head movement can greatly simplify the extraction of behaviorally relevant information from optic flow. Using quantitative human psychophysics, we test the hypothesis that the brain can use vestibular information to solve this flow-parsing problem and distinguish object motion from self-motion. In addition, we explore whether visual and vestibular signals maintain internal calibration, so that estimates of the same stimulus by different sensors agree with one another. By exposing subjects to spatially conflicting optic flow and vestibular heading information, we test two models of internal consistency: a visual dominance model, in which vestibular perception always changes to become consistent with vision (as in adaptation of the vestibulo-ocular reflex), and a reliability-based calibration model, in which calibration depends on cue reliability and ensures minimum-variance sensory estimates over time. According to the latter, vestibular perception should adapt toward the visually specified heading when visual reliability exceeds vestibular reliability, and visual perception should adapt toward the vestibularly specified heading when visual reliability is lower. We will also examine the effects of external feedback and how subjects weigh internal consistency against external accuracy. All experiments will use a 6-degree-of-freedom motion platform with an attached large-field-of-view stereo projection system and a two-alternative forced-choice methodology.
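The reliability-based calibration model described above can be illustrated with a minimal numerical sketch. This is not the project's code; it simply assumes the standard inverse-variance weighting of cues and a hypothetical adaptation rate, and shows why the less reliable cue shifts more toward the combined estimate.

```python
# Illustrative sketch of reliability-based calibration (assumed model, not the study's code).
# Each cue's heading estimate drifts toward the combined minimum-variance estimate,
# so the less reliable cue (larger sigma) adapts more.

def combined_estimate(h_vis, h_vest, sigma_vis, sigma_vest):
    """Minimum-variance (inverse-variance-weighted) heading estimate."""
    w_vis = sigma_vest**2 / (sigma_vis**2 + sigma_vest**2)
    return w_vis * h_vis + (1.0 - w_vis) * h_vest

def calibrate(h_vis, h_vest, sigma_vis, sigma_vest, rate=0.1, steps=50):
    """Iteratively shift each cue's estimate toward the combined estimate.

    `rate` and `steps` are arbitrary illustrative parameters.
    """
    for _ in range(steps):
        h_hat = combined_estimate(h_vis, h_vest, sigma_vis, sigma_vest)
        h_vis += rate * (h_hat - h_vis)
        h_vest += rate * (h_hat - h_vest)
    return h_vis, h_vest

# Visual cue more reliable (sigma_vis < sigma_vest): with a 10-degree cue
# conflict, the vestibular estimate adapts strongly toward the visual heading.
h_vis_final, h_vest_final = calibrate(h_vis=0.0, h_vest=10.0,
                                      sigma_vis=1.0, sigma_vest=3.0)
```

With these parameters the combined estimate is 0.9 * 0 + 0.1 * 10 = 1.0 degree, and both cues converge toward it; the vestibular estimate moves nine degrees while the visual estimate moves only one, mirroring the prediction that the less reliable cue does most of the adapting.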
Findings will be important for understanding basic visual/vestibular perceptual interactions with quantitative methodology and will open new directions in basic and clinical spatial orientation psychophysics.

Public Health Relevance

Understanding multisensory integration and self-motion perception would promote new strategies for treating spatial disorientation deficits common to many brain dysfunctions, including Alzheimer's disease. One of these deficits is an impaired ability to judge heading from optic flow, and this impairment is correlated with patients' difficulty in navigating through their surroundings. Better localization of these functions would help target new Alzheimer's therapies. In addition, the neurological correlates of otolith disorders remain a mystery, posing a major hurdle in defining effective therapeutic strategies. Understanding the properties of otolith-mediated self-motion perception is important for understanding and treating basic postural and spatial orientation deficits. Our experiments aim to fill a notable gap in knowledge, important for understanding and ultimately treating basic and cognitive deficits of spatial perception.

Agency
National Institutes of Health (NIH)
Institute
National Institute on Deafness and Other Communication Disorders (NIDCD)
Type
Research Project (R01)
Project #
5R01DC007620-07
Application #
8403490
Study Section
Central Visual Processing Study Section (CVP)
Program Officer
Platt, Christopher
Project Start
2005-07-01
Project End
2016-11-30
Budget Start
2012-12-01
Budget End
2013-11-30
Support Year
7
Fiscal Year
2013
Total Cost
$371,688
Indirect Cost
$134,188
Name
Baylor College of Medicine
Department
Neurosciences
Type
Schools of Medicine
DUNS #
051113330
City
Houston
State
TX
Country
United States
Zip Code
77030
Drugowitsch, Jan; DeAngelis, Gregory C; Angelaki, Dora E et al. (2015) Tuning the speed-accuracy trade-off to maximize reward rate in multisensory decision-making. Elife 4:e06678
Dokka, Kalpana; MacNeilage, Paul R; DeAngelis, Gregory C et al. (2015) Multisensory self-motion compensation during object trajectory judgments. Cereb Cortex 25:619-30
Dokka, Kalpana; DeAngelis, Gregory C; Angelaki, Dora E (2015) Multisensory Integration of Visual and Vestibular Signals Improves Heading Discrimination in the Presence of a Moving Object. J Neurosci 35:13599-607
Angelaki, Dora E (2014) How Optic Flow and Inertial Cues Improve Motion Perception. Cold Spring Harb Symp Quant Biol 79:141-8
Drugowitsch, Jan; DeAngelis, Gregory C; Klier, Eliana M et al. (2014) Optimal multisensory decision-making in a reaction-time task. Elife 3:
Zaidel, Adam; Ma, Wei Ji; Angelaki, Dora E (2013) Supervised calibration relies on the multisensory percept. Neuron 80:1544-57
MacNeilage, Paul R; Zhang, Zhou; DeAngelis, Gregory C et al. (2012) Vestibular facilitation of optic flow parsing. PLoS One 7:e40264
Zaidel, Adam; Turner, Amanda H; Angelaki, Dora E (2011) Multisensory calibration is independent of cue reliability. J Neurosci 31:13949-62
Dokka, Kalpana; MacNeilage, Paul R; DeAngelis, Gregory C et al. (2011) Estimating distance during self-motion: a role for visual-vestibular interactions. J Vis 11:
Gu, Yong; Fetsch, Christopher R; Adeyemo, Babatunde et al. (2010) Decoding of MSTd population activity accounts for variations in the precision of heading perception. Neuron 66:596-609

Showing the most recent 10 out of 18 publications