One of the important functions of visual input is to guide motor activity, especially eye movements. Primates use smooth pursuit eye movements to keep the eyes pointed at small, slowly moving objects, a capability that is much more poorly expressed in most other species. Moving objects evoke neural activity in a region of visual cortex called the middle temporal visual area, or MT, where neurons respond only to moving stimuli and encode the direction and speed of object motion. MT provides the visual input for smooth pursuit, but speed and direction are encoded only in the responses of a large population of MT neurons, not in the response of any individual neuron. The population response in MT therefore must be "decoded" to provide motor commands that specify the required direction and speed of smooth eye velocity.

This proposal uses awake, behaviorally trained rhesus monkeys to ask how neural circuits perform visual population decoding. First, it will combine recordings of the first 100 ms of pursuit eye movements with the electrical activity of MT neurons to explore the neural basis of an effect of stimulus form on pursuit variation. The approach is to characterize how changes in stimulus form alter the variation in eye movement, and then to search for parallel effects in the responses of specific sub-populations of MT neurons. Second, the proposal will use a combination of experiment and computation to constrain how neural circuits decode visual population responses. The approach will create a library of circuit models that make different predictions for the results of feasible experiments; the predictions will be compared with experiments that assess the correlations between trial-by-trial variations of MT responses and of pursuit eye movements. Special attention will be paid to how the MT-pursuit correlations vary as a function of an MT neuron's preferred speed, a feature that is particularly sensitive to the details of how the decoding computation is implemented. Third, the proposal will quantify the magnitude of noise added to sensory signals in the motor system, downstream of the decoding computation. As a whole, the proposal will inform an understanding of the neural circuit mechanisms the brain uses for visual population decoding.
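The idea that speed is carried by the population, not by single neurons, can be illustrated with a minimal sketch of "vector averaging," one classic decoding scheme for speed-tuned populations like MT. The tuning curves, parameter values, and noise model below are illustrative assumptions, not the circuit models the proposal will build.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model population: preferred speeds (deg/s), log-spaced
# as is roughly true of MT speed tuning.
preferred = np.logspace(0, 2, 50)  # 1 to 100 deg/s

def responses(stimulus_speed, sigma=0.35):
    """Illustrative log-Gaussian tuning: each neuron fires most strongly
    when the stimulus speed is near its preferred speed."""
    return np.exp(-0.5 * ((np.log(stimulus_speed) - np.log(preferred)) / sigma) ** 2)

def decode(r):
    """Vector-average readout: each neuron 'votes' for its preferred speed,
    weighted by its firing; the estimate lives only in the population."""
    return np.exp(np.sum(r * np.log(preferred)) / np.sum(r))

# A single noisy trial at 20 deg/s: no individual neuron's response
# specifies the speed, but the weighted population vote recovers it.
r = np.clip(responses(20.0) + 0.05 * rng.standard_normal(preferred.size), 0, None)
estimate = decode(r)
```

In this sketch, trial-by-trial noise in `r` propagates through `decode` into variation of the estimate, which is the logic behind comparing trial-by-trial MT variation with pursuit variation; different readout rules (e.g. weighting by preferred speed versus normalizing differently) predict different patterns of MT-pursuit correlation across preferred speeds.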

Public Health Relevance

Many of our actions are guided by what we see. This application asks how input through our eyes is processed by the brain and converted into the muscle contractions that move our eyes, focusing on the tracking of moving objects. Understanding how visual inputs guide movement in normal subjects will inform the diagnosis and therapy of eye movement disorders. It will be especially important for relieving the double vision caused by misalignment of the eyes, and the poor vision that results from eye movement disorders that make it difficult to track moving objects.

Agency: National Institutes of Health (NIH)
Institute: National Eye Institute (NEI)
Type: Research Project (R01)
Project #: 5R01EY003878-33
Application #: 8534119
Study Section: Central Visual Processing Study Section (CVP)
Program Officer: Steinmetz, Michael A
Project Start: 1981-09-01
Project End: 2014-08-31
Budget Start: 2013-09-01
Budget End: 2014-08-31
Support Year: 33
Fiscal Year: 2013
Total Cost: $223,725
Indirect Cost: $81,225
Name: Duke University
Department: Biology
Type: Schools of Medicine
DUNS #: 044387793
City: Durham
State: NC
Country: United States
Zip Code: 27705
Joshua, M; Lisberger, S G (2015) A tale of two species: Neural integration in zebrafish and monkeys. Neuroscience 296:80-91
Yang, Yan; Lisberger, Stephen G (2014) Purkinje-cell plasticity and cerebellar motor learning are graded by complex-spike duration. Nature 510:529-32
Yang, Yan; Lisberger, Stephen G (2014) Role of plasticity at different sites across the time course of cerebellar motor learning. J Neurosci 34:7077-90
Lee, Joonyeol; Yang, Jin; Lisberger, Stephen G (2013) Control of the gain of visual-motor transmission occurs in visual coordinates for smooth pursuit eye movements. J Neurosci 33:9420-30
Hohl, Sonja S; Chaisanguanthum, Kris S; Lisberger, Stephen G (2013) Sensory population decoding for visually guided movements. Neuron 79:167-79
Huang, Xin; Lisberger, Stephen G (2013) Circuit mechanisms revealed by spike-timing correlations in macaque area MT. J Neurophysiol 109:851-66
Niu, Yu-Qiong; Lisberger, Stephen G (2011) Sensory versus motor loci for integration of multiple motion signals in smooth pursuit eye movements and human motion perception. J Neurophysiol 106:741-53
Hohl, Sonja S; Lisberger, Stephen G (2011) Representation of perceptually invisible image motion in extrastriate visual area MT of macaque monkeys. J Neurosci 31:16561-9
Churchland, Mark M; Yu, Byron M; Cunningham, John P et al. (2010) Stimulus onset quenches neural variability: a widespread cortical phenomenon. Nat Neurosci 13:369-78
Lisberger, Stephen G (2010) Visual guidance of smooth-pursuit eye movements: sensation, action, and what happens in between. Neuron 66:477-91

Showing the most recent 10 out of 73 publications