Objects move through the environment in three dimensions, and humans are clearly capable of perceiving such motion in depth. Visual neuroscientists do not, however, know how our nervous system encodes this fundamental aspect of the visual world. Despite large literatures on motion and depth perception, there is a surprising lack of work integrating these two visual features to directly characterize how the brain processes the three-dimensional direction of moving objects. The goal of this proposed research is to understand how neural circuits in the primate brain exploit binocular information to represent the direction of objects moving through a 3D environment. First, we will psychophysically characterize the binocular cues to 3D motion. Recent work suggests that the perception of motion through depth relies on two binocular cues: one based on changing disparities over time, and one based on an interocular comparison of velocities. Our overarching hypothesis is that the velocity-based cue is of central importance for the perception of 3D motion. We will therefore perform psychophysical experiments that isolate eye-specific motion signals and characterize them relative to (and in interaction with) disparity-based signals. These experiments will unpack the psychophysical building blocks and signatures of this perceptual information and will refine the visual displays used for neuroimaging and electrophysiological studies. Second, we will use neuroimaging to identify the neural circuits that process 3D motion. Taking stimuli and insights from our psychophysical experiments, we will perform fMRI experiments to visualize this processing in the human brain. Direction-selective adaptation protocols will be used to characterize both the disparity-based and velocity-based cues and their interactions, and to understand how (or whether) these cues are integrated into a single (cue-independent) representation of 3D motion.
These experiments will also assess how directly psychophysical assays of the system map onto neural signals. Third, we will perform electrophysiology to specify the underlying neural computations. Guided by the psychophysics and neuroimaging, we will perform single-neuron recordings to characterize the binocular neural signals that encode 3D motion. Recordings in V1 will employ multi-electrode arrays; recordings in MT will employ both multiple-tetrode and single-electrode awake preparations. This work will test the hypothesis that eye-specific motion signals are represented at the level of V1 and are then integrated (by single neurons) in MT. This 3D motion pathway may be exposed at slower speeds, and thus may be multiplexed in the same circuitry known to extract 2D (frontoparallel) direction when assessed using faster speeds.

Public Health Relevance

Objects in the real world move through three dimensions, and humans rely on their ability to sense motion in depth in order to guide appropriate actions. A thorough understanding of the neural basis of the perception of 3D motion will prove valuable for the refinement of biomedical technologies involving the visually guided control of robotic or prosthetic devices in a dynamic three-dimensional environment.

Agency: National Institutes of Health (NIH)
Institute: National Eye Institute (NEI)
Type: Research Project (R01)
Project #: 5R01EY020592-03
Application #: 8535770
Study Section: Central Visual Processing Study Section (CVP)
Program Officer: Steinmetz, Michael A
Project Start: 2011-09-01
Project End: 2015-08-31
Budget Start: 2013-09-01
Budget End: 2014-08-31
Support Year: 3
Fiscal Year: 2013
Total Cost: $364,698
Indirect Cost: $111,523
Name: University of Texas Austin
Department: Biology
Type: Schools of Arts and Sciences
DUNS #: 170230239
City: Austin
State: TX
Country: United States
Zip Code: 78712
Joo, Sung Jun; Katz, Leor N; Huk, Alexander C (2016) Decision-related perturbations of decision-irrelevant eye movements. Proc Natl Acad Sci U S A 113:1925-30
Joo, Sung Jun; Czuba, Thaddeus B; Cormack, Lawrence K et al. (2016) Separate Perceptual and Neural Processing of Velocity- and Disparity-Based 3D Motion Signals. J Neurosci 36:10791-10802
Greer, Devon A; Bonnen, Kathryn; Huk, Alexander C et al. (2016) Speed discrimination in the far monocular periphery: A relative advantage for interocular comparisons consistent with self-motion. J Vis 16:7
Katz, Leor N; Hennig, Jay A; Cormack, Lawrence K et al. (2015) A Distinct Mechanism of Temporal Integration for Motion through Depth. J Neurosci 35:10212-6
Goonetilleke, Samanthi C; Katz, Leor; Wood, Daniel K et al. (2015) Cross-species comparison of anticipatory and stimulus-driven neck muscle activity well before saccadic gaze shifts in humans and nonhuman primates. J Neurophysiol 114:902-13
Bonnen, Kathryn; Burge, Johannes; Yates, Jacob et al. (2015) Continuous psychophysics: Target-tracking to measure visual sensitivity. J Vis 15:
Czuba, Thaddeus B; Huk, Alexander C; Cormack, Lawrence K et al. (2014) Area MT encodes three-dimensional motion. J Neurosci 34:15522-33
Eastman, Kyler M; Huk, Alexander C (2012) PLDAPS: A Hardware Architecture and Software Toolbox for Neurophysiology Requiring Complex Visual Stimuli and Online Behavioral Control. Front Neuroinform 6:1
Huk, Alexander C (2012) Multiplexing in the primate motion pathway. Vision Res 62:173-80
Czuba, Thaddeus B; Rokers, Bas; Huk, Alexander C et al. (2012) To CD or not to CD: Is there a 3D motion aftereffect based on changing disparities? J Vis 12:7

Showing the most recent 10 out of 11 publications