Perceiving the motion of objects in our environment is critical to many aspects of daily life. Although much is known about the neural basis of visual motion perception, almost all of that knowledge is derived from experiments in which the observer's eyes, head, and body remain stationary. When an observer moves through the environment, the motion of an object on the retina depends on both the movement of the object in the world and the self-motion of the observer. Thus, to estimate the motion of objects in the world, self-motion must be factored into the computations. This capability is critical to daily activities such as driving a vehicle, during which it is important to accurately judge the movement of other objects in the environment, such as other vehicles or pedestrians. As perception of self-motion is known to be degraded in neurological disorders such as Alzheimer's disease and during normal aging, understanding the neural basis of object motion perception during self-motion is of substantial importance to developing therapies for these deficits. We propose a series of experiments that provides the first systematic exploration of the neural mechanisms of object motion perception during self-motion.
Aim #1 explores visual mechanisms by which optic flow may be parsed into components related to self-motion and object motion. Behavioral studies have demonstrated perceptual consequences of flow parsing, but its neural basis is unknown. We test the hypothesis that responses of neurons in retinotopic motion areas mediate the perceptual effects of flow parsing. Aim #2 employs a novel behavioral paradigm in which macaques are trained to judge object motion during self-motion and to report their percepts in either world-centered or head-centered coordinates. We hypothesize that both visual and vestibular self-motion signals make important contributions to the perception of object motion. By recording from neurons in multiple brain areas during performance of this task, we seek to identify neural correlates of the computations that are used to flexibly represent object motion in either world-centered or head-centered coordinates. The proposed research is directly relevant to the research priorities of the Strabismus, Amblyopia, and Visual Processing program at the National Eye Institute.
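To make the computational problem concrete, a minimal numerical sketch follows (for exposition only; it is not the model or analysis proposed in these aims). The function names are hypothetical, and the geometry is deliberately simplified: planar 2-D image velocities, a single depth plane, and no eye or head rotation, so that the retinal velocity of the object directly gives its head-centered motion and subtracting the self-motion-induced flow gives its world-centered motion.

```python
import numpy as np

# Simplified illustration of flow parsing and the two report frames.
# Assumptions (not from the proposal): 2-D velocities in deg/s, one depth
# plane, no eye or head rotation, and hypothetical function names.

def self_motion_flow(observer_velocity):
    """Optic flow that a stationary scene point would produce, induced purely
    by the observer's own translation (opposite to that translation in this
    simplified geometry)."""
    return -np.asarray(observer_velocity, dtype=float)

def retinal_motion(object_velocity_world, observer_velocity):
    """Image motion of the object: its world motion plus the self-motion-
    induced flow component."""
    return np.asarray(object_velocity_world, dtype=float) + self_motion_flow(observer_velocity)

def flow_parse(retinal_velocity, estimated_observer_velocity):
    """Subtract the estimated self-motion component from the retinal signal to
    recover scene-relative (world-centered) object motion."""
    return np.asarray(retinal_velocity, dtype=float) - self_motion_flow(estimated_observer_velocity)

# Example: observer translates rightward at 1 deg/s; the object drifts upward
# at 2 deg/s in the world.
observer_vel = np.array([1.0, 0.0])
object_vel_world = np.array([0.0, 2.0])

# With no eye/head rotation, retinal motion equals head-centered object motion.
retinal = retinal_motion(object_vel_world, observer_vel)  # head-centered report: [-1., 2.]
world = flow_parse(retinal, observer_vel)                 # world-centered report: [0., 2.]
print("head-centered:", retinal, "world-centered:", world)
```

In realistic viewing, depth, perspective projection, and rotational flow make this subtraction considerably more complex; the sketch conveys only the basic decomposition of retinal motion into object-motion and self-motion components.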

Public Health Relevance

We propose a series of experiments to unravel the neural basis of the perception of object motion during self-motion. Deficits in perceiving self-motion are among the earliest consequences of Alzheimer's disease, and associated deficiencies in judging object motion may substantially impact patients' lives (such as when driving a vehicle). Little is known about how the brain computes object motion during self-motion, and the knowledge gained will inform the treatment of disease and the development of visual prosthetics for active observers.

Agency: National Institutes of Health (NIH)
Institute: National Eye Institute (NEI)
Type: Research Project (R01)
Project #: 5R01EY016178-14
Application #: 9608015
Study Section: Mechanisms of Sensory, Perceptual, and Cognitive Processes Study Section (SPC)
Program Officer: Flanders, Martha C
Project Start: 2004-12-01
Project End: 2020-11-30
Budget Start: 2018-12-01
Budget End: 2019-11-30
Support Year: 14
Fiscal Year: 2019
Total Cost:
Indirect Cost:
Name: University of Rochester
Department: Ophthalmology
Type: Schools of Arts and Sciences
DUNS #: 041294109
City: Rochester
State: NY
Country: United States
Zip Code: 14627
Shao, Mengmeng; DeAngelis, Gregory C; Angelaki, Dora E et al. (2018) Clustering of heading selectivity and perception-related activity in the ventral intraparietal area. J Neurophysiol 119:1113-1126
Chen, Xiaodong; DeAngelis, Gregory C; Angelaki, Dora E (2018) Flexible egocentric and allocentric representations of heading signals in parietal cortex. Proc Natl Acad Sci U S A 115:E3305-E3312
Lakshminarasimhan, Kaushik J; Petsalis, Marina; Park, Hyeshin et al. (2018) A Dynamic Bayesian Observer Model Reveals Origins of Bias in Visual Path Integration. Neuron 99:194-206.e5
Laurens, Jean; Liu, Sheng; Yu, Xiong-Jie et al. (2017) Transformation of spatiotemporal dynamics in the macaque vestibular system from otolith afferents to cortex. Elife 6:
Sasaki, Ryo; Angelaki, Dora E; DeAngelis, Gregory C (2017) Dissociation of Self-Motion and Object Motion by Linear Population Decoding That Approximates Marginalization. J Neurosci 37:11204-11219
Ohshiro, Tomokazu; Angelaki, Dora E; DeAngelis, Gregory C (2017) A Neural Signature of Divisive Normalization at the Level of Multisensory Integration in Primate Cortex. Neuron 95:399-411.e8
Chen, Aihua; Gu, Yong; Liu, Sheng et al. (2016) Evidence for a Causal Contribution of Macaque Vestibular, But Not Intraparietal, Cortex to Heading Perception. J Neurosci 36:3789-98
Sunkara, Adhira; DeAngelis, Gregory C; Angelaki, Dora E (2016) Joint representation of translational and rotational components of optic flow in parietal cortex. Proc Natl Acad Sci U S A 113:5077-82
Kim, HyungGoo R; Pitkow, Xaq; Angelaki, Dora E et al. (2016) A simple approach to ignoring irrelevant variables by population decoding based on multisensory neurons. J Neurophysiol 116:1449-67
Gu, Yong; Cheng, Zhixian; Yang, Lihua et al. (2016) Multisensory Convergence of Visual and Vestibular Heading Cues in the Pursuit Area of the Frontal Eye Field. Cereb Cortex 26:3785-801

Showing the most recent 10 out of 49 publications