Discovering the fundamental vision-motion primitives that people use to navigate and manipulate objects would lead to a natural human-machine interface for programming robotic systems, especially in unstructured environments. In this proposal, we begin exploring the definition of a set of natural robotic interface commands used for navigation. A more natural navigational interface would allow robots to be more easily integrated into applications such as material handling for flexible manufacturing, planetary or underwater exploration, or automated wheelchairs. The programmer issues a command such as "go there" to the robot by specifying "there" as a location on a video screen. The robot then navigates to the desired location, avoiding any obstacles along the way. As a tool for discovering a complete set of navigational commands, we will implement and experiment with an initial set of commands. This experimentation should show where our initial command set is redundant or lacking and is essential for ensuring that the commands are natural for the user. As part of this research, we will develop a vision algorithm that can extract visually distinctive features from a wide variety of objects. The command set and the algorithms for feature extraction and tracking will be the largest contribution of this work. Future work involves further testing of the navigational commands on a convenient test platform and extending the methodology derived for discovering the primitive navigation commands to deriving manipulation commands.
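
To make the "extract distinctive features, then track them while navigating" idea concrete, the following is a minimal illustrative sketch, not the proposal's algorithm: it assumes a modern OpenCV pipeline (Shi-Tomasi corner detection plus pyramidal Lucas-Kanade optical flow) and a hypothetical camera index, with the tracked feature positions standing in for the input a navigation controller would consume.

    # Illustrative sketch only (modern OpenCV stand-in, not the proposal's 1992 method):
    # detect visually distinctive points, then track them frame to frame.
    import cv2

    cap = cv2.VideoCapture(0)              # hypothetical camera index
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    # Extract visually distinctive features (Shi-Tomasi corners) in the first frame.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Track the features into the new frame with Lucas-Kanade optical flow.
        new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        good = new_pts[status.flatten() == 1]
        # ... tracked feature positions would feed a "go there" navigation controller ...
        prev_gray, pts = gray, good.reshape(-1, 1, 2)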

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Application #: 9210560
Program Officer: Howard Moraff
Budget Start: 1992-07-01
Budget End: 1995-12-31
Fiscal Year: 1992
Total Cost: $100,254
Name: Northeastern University
City: Boston
State: MA
Country: United States
Zip Code: 02115