Stereopsis, the process of extracting depth information from differences between the images in the two eyes, is one of the most important sources of 3D information in humans. Much has been learned about how the visual system identifies corresponding points in the two retinas and about the neural mechanisms responsible for estimating disparity. Yet virtually all previous studies have focused on the spatial characteristics of the retinal images, paying little attention to eye movements. Humans continually move their eyes, even when attempting to hold gaze on a single point. Fixational eye movements (FEM), which include small saccades (microsaccades) and incessant eye drifts, constantly move the image on the retinas. FEM are now known to contribute to the processing of luminance signals by transforming spatial patterns into temporal modulations on the retina, enhancing fine spatial detail. My own previous research has shown that monocular FEM contribute 0.15 logMAR to visual acuity, approximately two lines on an eye chart.

However, very little is known about: (a) the characteristics of FEM in the two eyes as humans make fine depth judgments; (b) whether FEM contribute to the processing of disparity signals; and (c) how standard models of disparity-sensitive neurons are affected by the real input signals that FEM produce. Until recently, experimental investigation of these questions was not feasible because of the technical challenges of precisely recording FEM in both eyes during natural viewing and of controlling retinal stimulation. Our lab has now developed new systems for accurate eye-tracking and real-time gaze-contingent control that overcome these challenges. The objective of this proposal is to investigate the impact of FEM on stereopsis at the behavioral, perceptual, and computational levels. Our central hypothesis is that FEM contribute to fine depth judgments by introducing disparity modulations that drive the neural mechanisms of stereopsis. To test this hypothesis, we will pursue three aims.
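To make the space-to-time conversion concrete, the following is a minimal simulation sketch, not taken from the proposal; the grating frequency and drift diffusion constant are illustrative assumptions. A Brownian-motion model of ocular drift sweeps a static grating across one retinal location, turning purely spatial contrast into temporal luminance modulation.

```python
import numpy as np

# Illustrative sketch (not from the proposal): how ocular drift turns a static
# spatial pattern into temporal modulation at a single retinal location.
# All parameters below are hypothetical, chosen only for demonstration.

rng = np.random.default_rng(0)

freq_cpd = 10.0     # spatial frequency of a luminance grating, cycles/deg
dt = 0.001          # time step, s (1 kHz sampling)
duration = 1.0      # s
diffusion = 100.0   # assumed drift diffusion constant, arcmin^2/s

n = int(duration / dt)
# Brownian-motion model of ocular drift (position in arcmin)
steps = rng.normal(0.0, np.sqrt(2 * diffusion * dt), size=n)
eye_pos_arcmin = np.cumsum(steps)

# Luminance seen by one photoreceptor as the grating slides across it
phase = 2 * np.pi * freq_cpd * (eye_pos_arcmin / 60.0)   # arcmin -> deg
luminance = 0.5 + 0.5 * np.cos(phase)

# A perfectly still eye would see a constant; drift yields temporal modulation
print(f"temporal modulation (std of luminance): {luminance.std():.3f}")
```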
In Aim 1, using new tools to precisely measure FEM, we will provide the first high-resolution characterization of binocular fixational behavior during maintained fixation and fine depth judgments of targets at different distances. While the precision of monocular fixation in 2D is now well established, the precision of binocular fixation in depth remains unknown, even though this behavior determines the actual input to the visual system.
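As a back-of-envelope illustration of why this precision matters, the sketch below (all numbers are assumptions, not values from the proposal) converts a hypothetical level of vergence noise into depth uncertainty at the fixation point, using the small-angle approximation theta ≈ IPD / z.

```python
import numpy as np

# Assumed geometry, for illustration only (not values from the proposal).
ipd = 0.063                           # interpupillary distance, m (typical adult)
z = 0.50                              # viewing distance, m
sigma_vergence = np.deg2rad(5 / 60)   # hypothetical 5 arcmin of vergence noise

# Vergence angle theta ~ ipd / z, so |dz| ~ (z**2 / ipd) * |dtheta|:
# arcminute-scale angular noise maps to millimeter-scale depth uncertainty.
sigma_depth = z**2 / ipd * sigma_vergence
print(f"depth uncertainty ~ {sigma_depth * 1000:.1f} mm")   # ~5.8 mm
```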
In Aim 2, we will experimentally examine whether disparity modulations introduced by FEM affect stereopsis. Using a custom system for gaze-contingent display to control stimulation of both retinas, we will assess fine depth perception in the presence and absence of FEM.
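A minimal sketch of the logic behind such retinal stabilization follows; the function is hypothetical, not the lab's actual API. Shifting the stimulus by the eye's own displacement on every frame freezes the image on the retina; applied to each eye independently, this removes the disparity modulations that FEM would otherwise introduce.

```python
import numpy as np

def stabilizing_offsets(gaze_x, gaze_y):
    """Per-frame stimulus offsets that cancel eye motion on the retina.

    gaze_x, gaze_y: gaze samples (deg), one per display frame, for one eye.
    Drawing the stimulus displaced by these offsets keeps its retinal
    position fixed; applied to each eye separately, it removes the
    disparity modulations normally introduced by FEM.
    """
    gx = np.asarray(gaze_x, dtype=float)
    gy = np.asarray(gaze_y, dtype=float)
    return gx - gx[0], gy - gy[0]

# Example: a 0.1 deg rightward drift yields an equal rightward stimulus shift.
dx, dy = stabilizing_offsets([0.00, 0.05, 0.10], [0.0, 0.0, 0.0])
print(dx)   # [0.   0.05 0.1 ]
```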
Aim 3 will focus on the impact of FEM on neural computations. We will examine how the temporal input signals to the retinas measured in Aims 1 and 2 interact with the known characteristics of binocular neurons involved in stereopsis. Unlike the static inputs assumed by standard models of stereopsis, the real inputs shaped by FEM keep both retinal images in continual motion. This research will either (a) reveal that the visual system uses these motion signals to encode disparity, as we hypothesize, or, alternatively and equally importantly, (b) elucidate the mechanisms by which it copes with the changes in disparity caused by FEM.
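For reference, the standard model alluded to here is typically the binocular energy model (Ohzawa, DeAngelis & Freeman, 1990). Below is a minimal one-dimensional sketch with illustrative filter parameters of our own choosing; note that it operates on static image patches, whereas FEM make both inputs time-varying.

```python
import numpy as np

def gabor(x, sigma, freq, phase):
    """1D Gabor receptive field (x, sigma in deg; freq in cycles/deg)."""
    return np.exp(-x**2 / (2 * sigma**2)) * np.cos(2 * np.pi * freq * x + phase)

def energy_response(left, right, x, sigma=0.2, freq=2.0, dphase=np.pi / 2):
    """Binocular energy response of a phase-disparity unit.

    Each simple cell sums its left- and right-eye filter outputs; the
    complex (energy) response sums the squares of a quadrature pair.
    The interocular phase shift dphase sets the preferred disparity,
    roughly dphase / (2 * pi * freq), i.e. 0.125 deg here.
    """
    resp = 0.0
    for p in (0.0, np.pi / 2):                       # quadrature pair
        s = (left @ gabor(x, sigma, freq, p)
             + right @ gabor(x, sigma, freq, p + dphase))
        resp += s**2
    return resp

def bar(x, center, width=0.05):
    """Gaussian luminance bar centered at `center` (deg)."""
    return np.exp(-(x - center)**2 / (2 * width**2))

x = np.linspace(-1.0, 1.0, 401)                      # retinal axis, deg
for d in (0.0, 0.125, 0.25):                         # test disparities, deg
    r = energy_response(bar(x, -d / 2), bar(x, +d / 2), x)
    print(f"disparity {d:5.3f} deg -> response {r:8.2f}")
```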

Public Health Relevance

Humans constantly move their eyes while gathering visual information. By investigating the functions of binocular eye movements, this project will advance understanding of how the visual system extracts depth information from the signals in the two eyes and of whether abnormal eye movements impair this process.

Agency: National Institutes of Health (NIH)
Institute: National Eye Institute (NEI)
Type: Predoctoral Individual National Research Service Award (F31)
Project #: 1F31EY029565-01
Application #: 9610984
Study Section: Special Emphasis Panel (ZRG1)
Program Officer: Agarwal, Neeraj
Project Start: 2018-09-01
Project End: 2021-08-31
Budget Start: 2018-09-01
Budget End: 2019-08-31
Support Year: 1
Fiscal Year: 2018
Total Cost:
Indirect Cost:
Name: Boston University
Department: Pharmacology
Type: Schools of Medicine
DUNS #: 604483045
City: Boston
State: MA
Country: United States
Zip Code: