The pattern of image motion across the retina (optic flow) during self-motion provides a powerful cue to the heading direction of the observer. However, optic flow processing poses a number of computationally difficult problems, the most notable being segregation of object motion from self-motion. Vestibular signals about the linear components of head movement can greatly simplify the extraction of behaviorally relevant information from optic flow. Using quantitative human psychophysics, here we test the hypothesis that vestibular information can be used by the brain to solve the flow-parsing problem and distinguish object motion from self-motion. In addition, we explore whether visual and vestibular signals maintain internal calibration so as to ensure that estimates of the same stimulus by different sensors agree with one another. By exposing subjects to spatially conflicting optic flow and vestibular heading information, we test two models of internal consistency: a visual dominance model, which predicts that, like the vestibulo-ocular reflex, vestibular perception always changes to become consistent with vision, and a reliability-based calibration model, which depends on cue reliability and ensures minimum-variance sensory estimates over time. According to the latter, vestibular perception should adapt toward the visually specified heading when visual reliability is higher than vestibular reliability, and visual perception should adapt toward the vestibular-specified heading when visual reliability is lower than vestibular reliability. We will also examine the effects of external feedback and how subjects weigh internal consistency against external accuracy. A 6-degree-of-freedom motion platform with an attached large-field-of-view stereo projection system and two-alternative forced-choice methodology will be used for all experiments.
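The reliability-based calibration model described above builds on standard minimum-variance (inverse-variance-weighted) cue combination, under which the combined heading estimate lies closer to the more reliable cue. A minimal sketch of that weighting rule follows; the heading values and noise levels are hypothetical, chosen only for illustration.

```python
def combine_cues(h_vis, sigma_vis, h_vest, sigma_vest):
    """Combine visual and vestibular heading estimates (in degrees) by
    inverse-variance weighting, yielding the minimum-variance estimate.

    Each cue's weight is proportional to its reliability (1/variance).
    """
    w_vis = sigma_vest**2 / (sigma_vis**2 + sigma_vest**2)
    w_vest = 1.0 - w_vis
    h_hat = w_vis * h_vis + w_vest * h_vest
    # Variance of the combined estimate is always at or below that of
    # either single cue.
    sigma_hat = (sigma_vis**2 * sigma_vest**2
                 / (sigma_vis**2 + sigma_vest**2)) ** 0.5
    return h_hat, sigma_hat

# Hypothetical conflict trial: vision says 0 deg, vestibular says 10 deg.
# With vision the more reliable cue (smaller sigma), the combined estimate
# falls near the visual heading -- the direction toward which, under the
# reliability-based model, vestibular perception should adapt.
h_hat, sigma_hat = combine_cues(h_vis=0.0, sigma_vis=2.0,
                                h_vest=10.0, sigma_vest=4.0)
```

Under the visual dominance model, by contrast, vestibular perception would shift toward the visual heading regardless of the relative sigmas; the two models therefore make divergent predictions only when vestibular reliability exceeds visual reliability.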
Findings will be important for understanding basic visual/vestibular perceptual interactions using quantitative methodology and will open new directions in basic and clinical spatial orientation psychophysics.

Public Health Relevance

Understanding multisensory integration and self-motion perception would promote new strategies for treating spatial disorientation deficits common to many brain dysfunctions, including Alzheimer's disease. One of these deficits is an impaired ability to judge heading from optic flow, and this impairment is correlated with patients' difficulty in navigating through their surroundings. Better localization of these functions would help target new Alzheimer's therapies. In addition, the neurological correlates of otolith disorders remain a mystery, posing a major hurdle in defining effective therapeutic strategies. Understanding the properties of otolith-mediated self-motion perception is important for understanding and treating basic postural and spatial orientation deficits. Our experiments aim to fill a notable gap in knowledge, important for understanding and ultimately treating basic and cognitive deficits of spatial perception.

Agency
National Institutes of Health (NIH)
Institute
National Institute on Deafness and Other Communication Disorders (NIDCD)
Type
Research Project (R01)
Project #
5R01DC007620-08
Application #
8576392
Study Section
Central Visual Processing Study Section (CVP)
Program Officer
Platt, Christopher
Project Start
2005-07-01
Project End
2016-11-30
Budget Start
2013-12-01
Budget End
2014-11-30
Support Year
8
Fiscal Year
2014
Total Cost
$352,125
Indirect Cost
$127,125
Name
Baylor College of Medicine
Department
Neurosciences
Type
Schools of Medicine
DUNS #
051113330
City
Houston
State
TX
Country
United States
Zip Code
77030
Dokka, Kalpana; MacNeilage, Paul R; DeAngelis, Gregory C et al. (2015) Multisensory self-motion compensation during object trajectory judgments. Cereb Cortex 25:619-30
Drugowitsch, Jan; DeAngelis, Gregory C; Klier, Eliana M et al. (2014) Optimal multisensory decision-making in a reaction-time task. Elife 3:
Zaidel, Adam; Ma, Wei Ji; Angelaki, Dora E (2013) Supervised calibration relies on the multisensory percept. Neuron 80:1544-57
Zaidel, Adam; Turner, Amanda H; Angelaki, Dora E (2011) Multisensory calibration is independent of cue reliability. J Neurosci 31:13949-62
Dokka, Kalpana; MacNeilage, Paul R; DeAngelis, Gregory C et al. (2011) Estimating distance during self-motion: a role for visual-vestibular interactions. J Vis 11:
Gu, Yong; Fetsch, Christopher R; Adeyemo, Babatunde et al. (2010) Decoding of MSTd population activity accounts for variations in the precision of heading perception. Neuron 66:596-609
Fetsch, Christopher R; DeAngelis, Gregory C; Angelaki, Dora E (2010) Visual-vestibular cue integration for heading perception: applications of optimal cue integration theory. Eur J Neurosci 31:1721-9
MacNeilage, Paul R; Turner, Amanda H; Angelaki, Dora E (2010) Canal-otolith interactions and detection thresholds of linear and angular components during curved-path self-motion. J Neurophysiol 104:765-73
MacNeilage, Paul R; Banks, Martin S; DeAngelis, Gregory C et al. (2010) Vestibular heading discrimination and sensitivity to linear acceleration in head and world coordinates. J Neurosci 30:9084-94
Angelaki, Dora E; Klier, Eliana M; Snyder, Lawrence H (2009) A vestibular sensation: probabilistic approaches to spatial perception. Neuron 64:448-61

Showing the most recent 10 out of 14 publications