The neural mechanisms of sensory conflict and plasticity are not well understood, yet sensory conflict is common in everyday life. When it is not resolved or reduced, it can harm human health, for example by causing motion sickness. Perceptual plasticity can ameliorate motion sickness by reducing sensory conflict. Here we hypothesize that multisensory plasticity enables our senses to adapt dynamically to each other and to the external environment. Recent work using human and monkey psychophysics has distinguished two multisensory plasticity mechanisms: unsupervised and supervised. How and where multisensory plasticity takes place in the brain is unknown, and this proposal explores specific hypotheses about its neural basis.
In aim 1, we will search for neural correlates of adult multisensory plasticity in single cortical neuron activity and population responses from the dorsal medial superior temporal area (MSTd) and ventral intraparietal area (VIP).
In aim 2, we will search for direct links between neuronal activity and multisensory plasticity using reversible chemical inactivation.
The aims outlined here test the hypothesis that unsupervised plasticity occurs in the relatively low-level multisensory cortical area MSTd, whereas supervised plasticity occurs in the higher-level multisensory area VIP, thought to lie closer to where the perceptual decisions that guide behavior are formed. Results from aim 1 on neuronal tuning curve shifts, choice probabilities, and noise correlations will be used to compute population thresholds and to simulate specific cortical lesions. These model simulations will be compared directly with the inactivation experiments in aim 2, providing novel information about neural decoding. Results from these experiments are critical for understanding the neural basis of multisensory plasticity, a fundamental operation our brain performs throughout life. Our combined use of psychophysics, single-cell and population recordings, causal manipulations during perceptual behavior, and computational modeling represents a state-of-the-art approach and maximizes the likelihood of success.
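As an illustration of this style of analysis (not the proposal's actual pipeline), a population discrimination threshold can be estimated from tuning curves and noise correlations via linear Fisher information, and a cortical lesion can be simulated by silencing a subset of neurons and recomputing the threshold. All parameters below (tuning shapes, preferred headings, the uniform noise-correlation value) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 50
prefs = rng.uniform(-90, 90, n_neurons)        # assumed preferred headings (deg)

def tuning(s, prefs, amp=20.0, sigma=40.0, base=5.0):
    """Gaussian heading tuning curves (hypothetical parameters)."""
    return base + amp * np.exp(-0.5 * ((s - prefs) / sigma) ** 2)

# Derivative of each tuning curve at the reference heading s0 = 0 deg
s0, ds = 0.0, 0.1
fprime = (tuning(s0 + ds, prefs) - tuning(s0 - ds, prefs)) / (2 * ds)

# Assumed noise covariance: Poisson-like variances with uniform correlations
rates = tuning(s0, prefs)
rho = 0.1                                      # assumed noise correlation
sd = np.sqrt(rates)
cov = rho * np.outer(sd, sd)
np.fill_diagonal(cov, rates)

def threshold(fprime, cov):
    """Discrimination threshold (deg) from linear Fisher information."""
    J = fprime @ np.linalg.solve(cov, fprime)  # J = f'^T C^{-1} f'
    return 1.0 / np.sqrt(J)

full = threshold(fprime, cov)

# Simulated lesion: silence half the population, recompute the threshold
keep = rng.choice(n_neurons, n_neurons // 2, replace=False)
lesioned = threshold(fprime[keep], cov[np.ix_(keep, keep)])

print(f"full population threshold: {full:.3f} deg")
print(f"lesioned (50%) threshold:  {lesioned:.3f} deg")
```

Removing neurons can only discard information, so the lesioned threshold is never lower than the full-population threshold; comparing such simulated lesions against measured post-inactivation behavior is one way to test how the population is actually read out.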

Public Health Relevance

Representations of a single stimulus via multiple sensory modalities normally combine to form a coherent, unified percept, but conflicts between them can cause physical discomfort. A disagreement between visually perceived self-motion and vestibular cues can lead to motion sickness, characterized by dizziness, fatigue, nausea, and vomiting, yet its neural correlates remain unknown. The proposed experiments aim to fill a notable gap in our knowledge of how visual-vestibular interactions adapt during sensory conflict.

National Institutes of Health (NIH)
National Institute on Deafness and Other Communication Disorders (NIDCD)
Research Project (R01)
Study Section: Sensorimotor Integration Study Section (SMI)
Program Officer: Poremba, Amy
Baylor College of Medicine
Schools of Medicine
United States
Chen, Xiaodong; DeAngelis, Gregory C; Angelaki, Dora E (2018) Flexible egocentric and allocentric representations of heading signals in parietal cortex. Proc Natl Acad Sci U S A 115:E3305-E3312
Zaidel, Adam; DeAngelis, Gregory C; Angelaki, Dora E (2017) Decoupled choice-driven and stimulus-related activity in parietal neurons may be misrepresented by choice probabilities. Nat Commun 8:715
Sasaki, Ryo; Angelaki, Dora E; DeAngelis, Gregory C (2017) Dissociation of Self-Motion and Object Motion by Linear Population Decoding That Approximates Marginalization. J Neurosci 37:11204-11219