Despite tremendous recent improvements in computational power, machine-vision systems are still far from replicating the efficiency, robustness, and speed of biological systems. A critical difference between organisms and machines lies in the acquisition of visual information. Unlike computers, biological vision systems are not passively exposed to the visual scene; instead, they actively seek useful information by means of goal-directed behavior. It is a long-standing proposal that forging a tight link between behavior and perception may be critical for developing more efficient machine-vision algorithms.
While examining a scene, humans coordinate eye movements with small movements of the head and body. Coordinated head/eye movements provide 3D information in the form of parallax: the different apparent motion of stationary objects at different distances. To examine the impact of this behavior on 3D vision, this project integrates computer modeling of the visual cortex with experiments in robotic vision and human psychophysics. The specific aims of this project are to: (a) measure the influence of coordinated head/eye movements on the accuracy of depth and distance judgments in human observers; (b) measure the 3D information resulting from head/eye movements by replicating human motor activity in an anthropomorphic robot; and (c) model the extraction and autonomous calibration of the parallax resulting from head/eye movements in the parietal cortex of macaques. By coupling a neural model of the brain with a robot that replicates human behavior, this research establishes a direct link between studies of human and machine vision. It has the potential to provide new insights into the brain, as well as to open the way to new machine-vision algorithms.
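As a rough illustration of the parallax cue at the core of these aims (a sketch, not the project's actual model), the Python snippet below applies the standard small-angle approximation omega = T / Z, where T is the lateral translation speed of the head, Z is the depth of a stationary point, and omega is that point's retinal angular velocity; the function name and the numbers in the example are hypothetical.

import math

def depth_from_parallax(flow_deg_per_s, head_speed_m_per_s):
    """Estimate the depth of a stationary point from motion parallax.

    Under a small lateral head translation at speed T (m/s), a stationary
    point at depth Z produces retinal angular motion of roughly
    omega = T / Z (rad/s), so Z = T / omega. This small-angle
    approximation assumes pure sideways translation and ignores the
    compensatory eye rotations that stabilize gaze.
    """
    omega = math.radians(flow_deg_per_s)   # retinal angular velocity, rad/s
    return head_speed_m_per_s / omega      # estimated depth, meters

# Example: a 5 cm/s head sway yielding 2 deg/s of retinal motion
# implies a point roughly 1.4 m away.
print(depth_from_parallax(flow_deg_per_s=2.0, head_speed_m_per_s=0.05))

In a full system, the same relation would be applied to dense optic-flow fields rather than a single point, with gaze-stabilizing eye rotations factored out before depth is recovered.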