The ability to learn from experience is perhaps the most fundamental feature of higher brain function. Even simple behaviors such as goal-directed reaching exhibit rapid and robust adaptation in response to changes in sensory feedback. These forms of learning have been extensively characterized at the behavioral level, and a variety of models have been developed to provide an intuitive understanding of these phenomena. Yet despite this progress, very little is known about how the underlying neural circuits change with learning or how sensory feedback drives these changes. This proposal addresses these questions, focusing on the rapid learning that occurs in response to shifted visual feedback of the arm ("visual-shift adaptation"). It has been shown that when visual feedback of the arm is displaced from its true position, for example with displacing prisms, compensatory shifts are observed in visual localization (where things "look" to be) and proprioceptive localization (where the arm "feels" to be). These shifts bring the two senses back into alignment. In order to uncover the physiological mechanism behind this process, this work will investigate how vision and proprioception are normally integrated in the brain ("sensory integration") and how that process changes with visual-shift adaptation ("sensory recalibration"). This will be accomplished by recording neural activity in several arm-movement-related areas of cerebral cortex as animals make sensory-guided reaching movements with shifted or unshifted visual feedback of the arm. The activity of large neuronal populations will be recorded simultaneously, permitting direct comparison to existing neural models of sensory integration and recalibration.
Aim 1 is to study the cortical mechanism of sensory integration. Quantitative measurements will be made of the visual and proprioceptive contributions to the neural computations that underlie reach planning. Specifically, the relative weighting of visual feedback will be inferred from the effect that various visual feedback shifts, included on randomly interleaved trials, have on the population activity. These experiments are designed i) to test whether cortical areas weight sensory inputs, ii) to identify which cortical areas change their weighting with behavior, and iii) to test whether this weighting is consistent with predictions from statistical theory.
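The "predictions from statistical theory" referenced in Aim 1 are typically those of minimum-variance (maximum-likelihood) cue combination, in which each sensory estimate is weighted by its reliability (inverse variance). The sketch below is an illustrative toy model, not the proposal's analysis pipeline; the function name and Gaussian-cue assumption are ours.

```python
def integrate(vis, prop, sigma_v, sigma_p):
    """Minimum-variance combination of a visual and a proprioceptive
    estimate of arm position, assuming independent Gaussian noise.

    Each cue is weighted by its inverse variance, so the less reliable
    cue contributes less to the combined estimate.
    """
    w_v = sigma_p**2 / (sigma_v**2 + sigma_p**2)   # visual weight
    est = w_v * vis + (1.0 - w_v) * prop            # combined estimate
    # Combined variance is lower than either cue alone.
    var = (sigma_v**2 * sigma_p**2) / (sigma_v**2 + sigma_p**2)
    return est, var
```

For example, if proprioception is twice as noisy as vision (sigma_p = 2, sigma_v = 1), the visual cue receives a weight of 0.8 and the combined estimate lies correspondingly closer to the visual one.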
Aim 2 is to study the cortical mechanism of sensory recalibration. Quantitative measurements will be made of the changes that occur to visual and proprioceptive signals in cortex during extended exposure to a constant visual shift, a situation that drives sensory recalibration. These experiments are designed to determine i) whether the changes in the sensory coding in a given cortical area can be explained by the misalignment of sensory inputs to that area and ii) whether these changes are consistent with predictions from statistical theory.
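Recalibration of the kind described in Aim 2 is commonly modeled as error-corrective learning: under a constant visual shift, the visual and proprioceptive estimates disagree, and each modality's calibration is nudged toward the other on every trial. The sketch below is a toy illustration under that assumption; the function name and the learning rates eta_v and eta_p are hypothetical, not values from the proposal.

```python
def recalibrate(vis_bias, prop_bias, shift, eta_v=0.02, eta_p=0.01, n_trials=200):
    """Toy error-corrective model of visual-proprioceptive recalibration.

    A constant visual shift creates an intersensory discrepancy; each
    trial, the visual and proprioceptive calibrations shift toward one
    another at modality-specific rates until the discrepancy vanishes.
    """
    for _ in range(n_trials):
        error = (vis_bias + shift) - prop_bias  # intersensory discrepancy
        vis_bias -= eta_v * error               # vision shifts one way...
        prop_bias += eta_p * error              # ...proprioception the other
    return vis_bias, prop_bias
```

Because both calibrations move, the model reproduces the behavioral signature of visual-shift adaptation noted above: compensatory shifts appear in both visual and proprioceptive localization, splitting the adaptation in proportion to the two learning rates.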
This project is aimed at discovering new mechanisms of learning in the neural circuits for eye-hand coordination in the cerebral cortex. This work will give us a deeper understanding of how sensory feedback from our movements drives rapid changes in our brains. In addition to its scientific impact, this work has two potential medical applications: i) to aid in the development of sensory prosthetic devices for motor control, and ii) to aid in the development of new, principled therapies for sensory and sensory-motor deficits following stroke.