This is the first-year funding of a three-year continuing award. The goal of this research is to study the execution of complex visual-motor tasks by a robot using a manipulator under visual control, based on an approach in which perceptual-motor-inference sequences are generated by accessing a "skill-base" of prior instances of perceptual-motor problem solving. The ideas will be tested by developing a "visual" robot that can build a brick wall. To execute complex tasks such as this, a robot must be able to decide whether to execute a manipulator action such as picking up a brick (in order to see it better, for example), a perceptual action such as changing the viewpoint, or a computational action such as inferring new visual features from the image data. The system will choose where to look and how to move things based on what it did the last time it was in a similar situation. It will access a library of stored cases, each describing a sequence of perceptual-motor actions, and will find and adapt the cases whose process components are most similar to the situation facing the system. Interaction thus becomes recalled process. The system will demonstrate how useful robotic eye-hand technology can be developed by exploiting perceptual-motor knowledge stored as cases. The approach should be widely applicable to problems involving integrated action and perception.
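The retrieve-and-adapt cycle described above can be illustrated with a minimal sketch. The abstract does not specify a representation or similarity measure, so everything here is an assumption: cases are indexed by a hypothetical numeric feature vector describing the situation, similarity is negative squared Euclidean distance, and the action names are invented placeholders.

```python
from dataclasses import dataclass

@dataclass
class Case:
    """One stored instance of perceptual-motor problem solving (hypothetical)."""
    features: list[float]  # situation descriptors, e.g. derived visual features
    actions: list[str]     # the recorded perceptual-motor action sequence

def similarity(a: list[float], b: list[float]) -> float:
    # Negative squared Euclidean distance: larger means more similar.
    return -sum((x - y) ** 2 for x, y in zip(a, b))

def retrieve(skill_base: list[Case], situation: list[float]) -> Case:
    # Find the stored case whose situation is most similar to the current one.
    return max(skill_base, key=lambda c: similarity(c.features, situation))

# Toy skill-base: two prior brick-laying episodes with made-up actions.
skill_base = [
    Case([0.0, 1.0], ["look-at-brick", "grasp", "place"]),
    Case([5.0, 2.0], ["change-viewpoint", "look-at-brick", "grasp"]),
]

best = retrieve(skill_base, [0.2, 0.9])
print(best.actions)  # the recalled action sequence, ready to be adapted
```

A full system would then adapt the recalled sequence to the new situation (e.g. substituting the current brick's pose) rather than replaying it verbatim; that adaptation step is the part the research proposes to study.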

Agency
National Science Foundation (NSF)
Institute
Division of Information and Intelligent Systems (IIS)
Application #
9421483
Program Officer
C.S. George Lee
Budget Start
1995-09-01
Budget End
1997-08-31
Fiscal Year
1994
Total Cost
$80,000
Name
Northwestern University at Chicago
City
Evanston
State
IL
Country
United States
Zip Code
60201