The goal of the proposed research is to understand how the visuo-motor system integrates different cues to three-dimensional object geometry to guide motion of the hand during goal-directed movements, in particular pointing, prehension, and object placement. The proposed experiments will investigate how the visual system integrates binocular and monocular cues to surface orientation and depth for both the planning and the on-line control of hand movements. We will use the results of these experiments to compare the visual strategies used to control different motor behaviors, to control different motor components of the same behavior, and for different phases of motor control (planning vs. on-line control). The results will inform the question of which motor control functions rely on common visual computations and which rely on task-specific computations. We will also compare the visual strategies used to combine cues for motor control with those used to make perceptual judgments, as a test of the hypothesis that independent visual mechanisms subserve these broadly defined task domains. We will further investigate how feedback obtained during performance of goal-directed movements adapts the visual computations used to derive object information for guiding hand movements. Specifically, we will artificially adjust the correlations between different 3D cues and physical target object geometry (e.g. surface orientation) and measure the induced changes in the cue weights used to estimate object geometry for motor control. We will measure transfer of learning between different visuo-motor tasks and between visuo-motor tasks and perceptual tasks as a further test of the task-specificity of visual computations. Experiments will be performed in a laboratory comprising a virtual display system for presenting visual target information, a robot arm for placing physical targets for subjects' reaches, and a real-time 3D motion tracking system for measuring hand movement kinematics. Novel computational analyses are described for quantifying the relative contributions of visual cues to motor control, both for motor planning and for on-line control of the hand during a movement. The research will elucidate the functional relationships between the visual processes subserving different functions, both at the broad level of perceptual judgment vs. visuo-motor behavior and at the more refined level of one motor task vs. another and of motor planning vs. on-line motor control.
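To illustrate the kind of cue-weight analysis this implies, the sketch below assumes a standard linear cue-combination model in which the visuo-motor estimate of, say, surface slant is a weighted sum of the slants signaled by binocular and monocular cues, and the weights are recovered by regressing motor responses on independently perturbed cue values. The variable names, the choice of slant as the geometric property, and the least-squares setup are illustrative assumptions, not the proposal's novel analyses.

```python
# Minimal sketch of estimating cue weights from cue-perturbation trials,
# assuming a linear cue-combination model:
#   response ~ w_bin * slant_binocular + w_mono * slant_monocular + bias
# (All names and the simulated data below are hypothetical.)
import numpy as np

def estimate_cue_weights(slant_binocular, slant_monocular, response):
    """Regress a motor response (e.g. terminal hand orientation) on the slants
    signaled by each cue across trials in which the cues are independently
    perturbed; the fitted coefficients are the effective cue weights."""
    X = np.column_stack([slant_binocular, slant_monocular, np.ones_like(response)])
    coefs, *_ = np.linalg.lstsq(X, response, rcond=None)
    w_bin, w_mono, bias = coefs
    return w_bin, w_mono, bias

# Example with simulated data for a subject weighting the binocular cue at 0.7.
rng = np.random.default_rng(0)
s_bin = rng.uniform(20, 40, size=200)          # slant signaled by stereo (deg)
s_mono = s_bin + rng.normal(0, 5, size=200)    # texture cue, perturbed per trial
resp = 0.7 * s_bin + 0.3 * s_mono + rng.normal(0, 2, size=200)
print(estimate_cue_weights(s_bin, s_mono, resp))
```

In the adaptation experiments described above, the same fit repeated before and after training with altered cue-geometry correlations would quantify any induced change in the weights.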