Sonar is an attractive sensing modality for robotics: it is inexpensive, consumes little energy, its signals are relatively simple to generate and process, it can be used in conjunction with camera vision, and for some applications it is the only practical sensing modality. This research investigates a form of adaptive sonar sensing in which the sensor changes its physical configuration during the sensing process in reaction to the observed echoes. The approach, analogous to active approaches in computer vision, is motivated by biological systems that turn their ears toward a sound source and move to observe an object from different directions. As part of this research, signal processing algorithms are investigated to develop a sensing system that can identify objects by their acoustic signatures, and the current model of echo production is extended to predict the echoes observed by the sonar system. The resulting sonar allows a robot to navigate an unstructured environment precisely and to construct accurate sonar maps by using environmental objects as naturally occurring beacons. This approach should support more autonomous and flexible operation of both mobile robots and robot arms, thus improving their usefulness in manufacturing applications.
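
The adaptive strategy described above can be pictured as a simple sense-and-reorient loop: emit a ping, estimate range from the time of flight of the first strong return, and steer the transducer toward the bearing that gave the strongest echo. The sketch below is only an illustration of that idea, not the project's actual algorithms; the `ping` callable, the fixed detection threshold, and the scanned bearings are hypothetical placeholders.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def range_from_echo(echo, fs, threshold=0.1):
    """Estimate target range from one received waveform via time of flight.

    echo: 1-D array of received amplitudes, starting at the emission instant.
    fs: sampling rate in Hz.
    Returns range in metres, or None if no sample exceeds the threshold.
    """
    hits = np.flatnonzero(np.abs(echo) >= threshold)
    if hits.size == 0:
        return None
    t_flight = hits[0] / fs                # seconds from emission to first return
    return SPEED_OF_SOUND * t_flight / 2.0  # halve the out-and-back path

def adaptive_scan(ping, bearings_deg):
    """Point the transducer at successive bearings and keep the strongest return.

    ping: callable(bearing_deg) -> (echo, fs); stands in for the real sensor.
    Returns (best_bearing_deg, range_m), or (None, None) if nothing was detected.
    """
    best = (None, None, -np.inf)           # (bearing, range, echo strength)
    for b in bearings_deg:
        echo, fs = ping(b)
        strength = float(np.max(np.abs(echo)))
        r = range_from_echo(echo, fs)
        if r is not None and strength > best[2]:
            best = (b, r, strength)
    return best[0], best[1]
```

A real system would likely replace the fixed amplitude threshold with matched filtering against the emitted pulse, which is closer in spirit to the acoustic-signature identification described in the abstract.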

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Application #: 9504079
Program Officer: Vladimir J. Lumelsky
Budget Start: 1995-08-01
Budget End: 2001-07-31
Fiscal Year: 1995
Total Cost: $403,645
Name: Yale University
City: New Haven
State: CT
Country: United States
Zip Code: 06520