This project tackles a critical barrier to long-term autonomy for robotic systems: the lack of theoretically well-founded self-calibration methods for the inertial and vision-based sensors commonly found on sophisticated robots. The project is motivated by the vision of power-up-and-go robotic systems that can operate autonomously for long periods without tedious manual sensor calibration. The research team addresses this problem in the context of vision-based mobile manipulation and navigation. The core foci of the work are (1) the development of a unified mathematical theory of anytime, automatic calibration for visual-inertial systems, and (2) an experimental characterization of the resulting algorithms on diverse, state-of-the-art robots: humanoids performing mobile manipulation and autonomous ground vehicles navigating outdoors. Inertial sensing is critical for humanoid balance control, while visual sensing relates the 3D world to the robot's body coordinates, thereby enabling manipulation. For autonomous ground vehicles, monocular and stereo camera calibration is still commonly performed manually using a known calibration target; the project eliminates this requirement. The expected outcomes of the project are (1) a theoretical foundation for humanoid robots to function autonomously in unstructured environments over significant periods of time, and (2) new navigation algorithms that allow ground vehicles to see farther and with greater acuity. The project explicitly incorporates undergraduate research in cooperation with an REU site currently operational in the USC Computer Science Department.
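For context, below is a minimal sketch of the conventional target-based camera calibration procedure that the project aims to obviate, written in Python with OpenCV. The checkerboard dimensions, square size, and image directory are illustrative assumptions and are not taken from the project.

```python
import glob
import numpy as np
import cv2 as cv

# Assumed checkerboard: 9x6 inner corners, 25 mm squares (illustrative values).
pattern_size = (9, 6)
square_size = 0.025  # meters

# 3D coordinates of the corners in the target's own frame (z = 0 plane).
objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square_size

obj_points, img_points, image_size = [], [], None
for fname in glob.glob("calib_images/*.png"):  # hypothetical image directory
    gray = cv.imread(fname, cv.IMREAD_GRAYSCALE)
    found, corners = cv.findChessboardCorners(gray, pattern_size)
    if not found:
        continue
    # Refine corner locations to sub-pixel accuracy before calibration.
    corners = cv.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv.TERM_CRITERIA_EPS + cv.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)
    image_size = gray.shape[::-1]  # (width, height)

# Estimate the intrinsic matrix K and lens distortion from the corner observations.
rms, K, dist, rvecs, tvecs = cv.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print("RMS reprojection error (pixels):", rms)
print("Camera matrix:\n", K)
print("Distortion coefficients:", dist.ravel())
```

Procedures of this kind require a person to collect many views of a known pattern before deployment; the self-calibration methods pursued in this project are intended to remove the need for such a target and for the associated manual data-collection step.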

Project Start:
Project End:
Budget Start: 2010-08-15
Budget End: 2016-07-31
Support Year:
Fiscal Year: 2010
Total Cost: $449,271
Indirect Cost:
Name: University of Southern California
Department:
Type:
DUNS #:
City: Los Angeles
State: CA
Country: United States
Zip Code: 90089