The goal of this project is to develop and experimentally verify a skill-based approach to hand-eye coordination for robotic systems. The work focuses particularly on control of robot manipulators using weakly calibrated or uncalibrated stereo vision. The major innovations are: 1) the use of reconfigurable, feature-based tracking mechanisms that simplify image processing; 2) the use of projective invariant-based feedback controllers that perform correctly despite calibration error; and 3) the development of a taxonomy of geometric "translation rules" for converting the geometric specification of a task into a visual specification of a task. The research is driven by a series of benchmark problems chosen from the manipulation domain. In addition to software development and experimentation, theoretical methods for analyzing the stability of visual tracking and of hand-eye servoing systems will be developed. Methods for detecting and responding to execution errors will be investigated. The long-term goal of this work is to construct a system that can automatically synthesize and execute a vision-based task specification from a geometric task specification.
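To illustrate the kind of feedback controller the abstract refers to, below is a minimal sketch of a standard image-based visual servoing law, where a camera velocity command is computed from the error between tracked image features and their goal positions. This is a generic textbook formulation, not the project's specific projective invariant-based controllers; the feature coordinates, depths, and gain are illustrative assumptions. Its relevance here is that the control law operates entirely on image-plane quantities, needing only a rough depth estimate rather than an accurate camera calibration.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Standard interaction (image Jacobian) matrix for one normalized
    image point (x, y) at estimated depth Z. Maps the 6-DOF camera
    velocity screw to the image-plane velocity of the point."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x**2), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y**2, -x * y, -x],
    ])

def ibvs_velocity(features, goals, depths, gain=0.5):
    """Camera velocity command v = -gain * L^+ (s - s*).

    Only image-feature errors and coarse depth estimates are used;
    this tolerance to calibration error is the property the abstract's
    uncalibrated-servoing controllers are designed around."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    error = (np.asarray(features) - np.asarray(goals)).ravel()
    return -gain * np.linalg.pinv(L) @ error

# Example (hypothetical values): four tracked point features,
# each slightly offset in x from its goal position.
s = [(0.1, 0.1), (-0.1, 0.1), (-0.1, -0.1), (0.1, -0.1)]
s_star = [(0.12, 0.1), (-0.08, 0.1), (-0.08, -0.1), (0.12, -0.1)]
v = ibvs_velocity(s, s_star, depths=[1.0] * 4)  # 6-vector velocity screw
```

The returned 6-vector is the commanded translational and rotational camera velocity; driving the image error to zero with this law converges even when the depth estimates are only approximate.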

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Application #: 9420982
Program Officer: Jing Xiao
Budget Start: 1995-07-15
Budget End: 1999-06-30
Fiscal Year: 1994
Total Cost: $250,001
Name: Yale University
City: New Haven
State: CT
Country: United States
Zip Code: 06520