Studies of high-acuity visual processing in awake animals are invariably faced with the difficulty of accounting for the effects of eye movements on the retinal stimulus. Animals that are well trained to maintain precise fixation continuously make involuntary eye movements, composed of both microsaccades and drift. Even in anaesthetized and paralysed animals, the eyes undergo slow drifts, as well as movements tied to heartbeat and respiration. The resulting uncertainty in the retinal stimulus can be large relative to the fine receptive fields (RFs) of neurons in primary visual cortex (V1), which are fixed in retinotopic coordinates. Conventional eye-tracking methods, such as implanted scleral eye coils and optical tracking techniques, have accuracies comparable to the magnitude of the fixational eye movements themselves (about 0.1 degrees), making them ill-suited to correct for such fine-grained changes in eye position. Thus, without accurately accounting for eye movements, the stimulus presented to such neurons is both uncontrolled and unknown, greatly limiting analyses of neural stimulus processing, cortical variability and the role of eye movements in visual processing. This is especially true for V1 neurons representing the central portion of the visual field (the fovea), which have extremely small RFs. As a result, relatively little is known about whether they process visual stimuli differently from neurons representing the non-foveal visual field, which is an important question given the overrepresentation of the fovea throughout visual cortex and the critical role the fovea plays in a variety of high-acuity visual behaviours. Although basic tuning properties of foveal V1 neurons have been measured, the detailed functional descriptions of V1 stimulus processing that have been developed for parafoveal neurons have yet to be tested, and important questions about functional specialization in the fovea remain.
To address these problems, here we present a method for inferring an animal's eye position using the activity of the V1 neurons themselves, leveraging their finely tuned RFs to derive precise information about the position of the stimulus on the retina. Our method utilizes multielectrode recordings and a recently developed nonlinear modelling approach to estimate an animal's eye position, along with its associated uncertainty, with the high spatial and temporal resolutions needed to study foveal V1 neurons. We demonstrate this approach using multielectrode array recordings from awake behaving macaques, and show that it allows for estimation of eye position with an accuracy on the order of 1 min of arc. Our method yields eye-tracking improvements in both foveal and parafoveal recordings, and is robust to the number and composition of the recorded units. Using this method allows us to obtain detailed functional models of the stimulus processing of foveal V1 neurons that are otherwise largely or entirely obscured by eye movements. In addition to allowing detailed analyses of high-resolution stimulus processing, our method can identify and correct for the effects of eye movements on measures of cortical variability, which has important implications for studies of neural coding more generally.
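The core idea — using the population response of neurons with known RFs as a likelihood function over eye position — can be illustrated with a minimal sketch. This is not the authors' code: the RF shapes, nonlinearity, parameter values and variable names below are all hypothetical stand-ins for the fitted models described above, and the example reduces the problem to a single 1-D displacement. Each candidate eye shift moves the stimulus on the simulated retina; the Poisson log-likelihood of the observed spike counts under each shifted stimulus is evaluated, and the maximum-likelihood shift is taken as the eye-position estimate.

```python
# Hypothetical sketch of likelihood-based eye-position inference from a
# population of model neurons with known receptive fields (RFs).
import numpy as np

rng = np.random.default_rng(0)

n_pix = 64          # 1-D stimulus; imagine one "pixel" per min of arc
n_neurons = 20      # simulated V1 units
true_shift = 3      # true (unknown to the estimator) eye displacement, pixels

# Gaussian-bump RFs at scattered retinal positions (stand-ins for fitted models).
centers = rng.integers(10, n_pix - 10, n_neurons)
x = np.arange(n_pix)
rfs = np.exp(-0.5 * ((x[None, :] - centers[:, None]) / 1.5) ** 2)

stimulus = rng.normal(size=n_pix)  # known on-screen stimulus (white noise)

def predicted_rate(shift):
    """LN-style firing rate: RF filter applied to the shifted stimulus,
    passed through a softplus nonlinearity so rates stay positive."""
    shifted = np.roll(stimulus, shift)
    drive = rfs @ shifted
    return np.log1p(np.exp(drive))

# Observed spike counts are generated at the true eye position.
spikes = rng.poisson(predicted_rate(true_shift))

def log_lik(shift):
    """Poisson log-likelihood (up to a constant) of the population response
    if the eye were displaced by `shift`."""
    r = predicted_rate(shift)
    return np.sum(spikes * np.log(r) - r)

# Maximum-likelihood estimate over a grid of candidate displacements.
candidates = np.arange(-8, 9)
estimate = candidates[np.argmax([log_lik(s) for s in candidates])]
print(estimate)  # should lie close to true_shift for an informative population
```

The spread of the log-likelihood across candidates also provides the uncertainty estimate mentioned above: a sharply peaked likelihood means the population pins down eye position tightly, while a flat one flags ambiguous epochs.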

Agency
National Institutes of Health (NIH)
Institute
National Eye Institute (NEI)
Type
Investigator-Initiated Intramural Research Projects (ZIA)
Project #
1ZIAEY000404-14
Application #
9155566
Study Section
Project Start
Project End
Budget Start
Budget End
Support Year
14
Fiscal Year
2015
Total Cost
Indirect Cost
Name
U.S. National Eye Institute
Department
Type
DUNS #
City
State
Country
Zip Code
Quaia, Christian; Optican, Lance M; Cumming, Bruce G (2018) Binocular summation for reflexive eye movements. J Vis 18:7
Seemiller, Eric S; Cumming, Bruce G; Candy, T Rowan (2018) Human infants can generate vergence responses to retinal disparity by 5 to 10 weeks of age. J Vis 18:17
Quaia, Christian; Optican, Lance M; Cumming, Bruce G (2017) Suppression and Contrast Normalization in Motion Processing. J Neurosci 37:11051-11066
Quaia, Christian; Optican, Lance M; Cumming, Bruce G (2017) Combining 1-D components to extract pattern information: It is about more than component similarity. J Vis 17:21
Tarawneh, Ghaith; Nityananda, Vivek; Rosner, Ronny et al. (2017) Invisible noise obscures visible signal in insect motion detection. Sci Rep 7:3496
Clery, Stephane; Cumming, Bruce G; Nienborg, Hendrikje (2017) Decision-Related Activity in Macaque V2 for Fine Disparity Discrimination Is Not Compatible with Optimal Linear Readout. J Neurosci 37:715-725
Joiner, Wilsaan M; Cavanaugh, James; Wurtz, Robert H et al. (2017) Visual Responses in FEF, Unlike V1, Primarily Reflect When the Visual Context Renders a Receptive Field Salient. J Neurosci 37:9871-9879
Read, Jenny C A; Cumming, Bruce G (2017) Visual Perception: Neural Networks for Stereopsis. Curr Biol 27:R594-R596
McFarland, James M; Cumming, Bruce G; Butts, Daniel A (2016) Variability and Correlations in Primary Visual Cortical Neurons Driven by Fixational Eye Movements. J Neurosci 36:6225-41
Cumming, Bruce G; Nienborg, Hendrikje (2016) Feedforward and feedback sources of choice probability in neural population responses. Curr Opin Neurobiol 37:126-132

Showing the most recent 10 out of 36 publications