Modern robots move across a variety of terrains and environments using wheels, legs, and other means, engaging in life-like hopping, jumping, walking, crawling, and running. These motion patterns are called gaits: a horse trotting or galloping is one example, as are human walking, running, and skipping. Essentially, for either a biological or a mechanical system, a gait is a locomotion pattern that involves large-amplitude body oscillations. These motions naturally cause impacts with the terrain that jostle on-board perceptual systems and directly influence what a robot actually "sees" as it moves. For instance, the body motion of a bounding horse-like robot may produce significant occlusions and oscillations in its on-board cameras that confound motion estimation and perceptual feedback.

Focusing on complex mobility robots, this project seeks to better understand the coupling between locomotion and visual perception in order to improve perceptual feedback for closed-loop motion estimation. The work is organized around two key questions: 1) How should a robot look to move well? 2) How should a robot move to see well? To address the first question, the periodic structure of gait-based motions will be leveraged to improve perceptual filtering as the robot carries out fixed (pre-determined) motions. The second half of the project will derive perceptual objectives and a new perceptual gait design framework to guide how high degree-of-freedom, complex mobility robots should move (locomote). The goal is to optimize feedback for closed-loop motion implementation, on-line adaptation, and learning, which are currently difficult or impossible for many complex mobility robots.
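To make the first idea concrete: one simple way a filter can exploit gait periodicity is gait-synchronized averaging, in which the repeatable, gait-locked component of a sensor signal is estimated by averaging over gait cycles and then subtracted, exposing the non-periodic motion of interest. The sketch below is purely illustrative and is not the project's method; the function names, the synthetic signal, and the sample-exact known gait period are all assumptions made for the example.

```python
# Illustrative sketch (hypothetical, not the award's actual algorithm):
# estimate the gait-locked periodic component of a 1-D sensor signal by
# averaging over gait cycles, then subtract it before motion estimation.
import numpy as np

def gait_synchronized_average(signal, period):
    """Fold the signal into gait cycles of `period` samples and return
    the per-phase mean, i.e., the repeatable periodic template."""
    n_cycles = len(signal) // period
    folded = signal[: n_cycles * period].reshape(n_cycles, period)
    return folded.mean(axis=0)

def remove_periodic_component(signal, period):
    """Subtract the gait-locked template from the full signal."""
    template = gait_synchronized_average(signal, period)
    n = len(signal)
    tiled = np.tile(template, n // period + 1)[:n]
    return signal - tiled

# Synthetic example: a slow drift (the motion we want to estimate)
# buried under a strong periodic bounce at the gait frequency.
period = 50                                  # assumed known gait period
t = np.arange(1000)
drift = 0.001 * t                            # signal of interest
bounce = 0.5 * np.sin(2 * np.pi * t / period)  # gait-induced oscillation
measured = drift + bounce
cleaned = remove_periodic_component(measured, period)
# The residual tracks the drift far better than the raw measurement does.
```

In practice the gait period would itself be estimated (or taken from the gait controller), and the averaging would run over a sliding window rather than the whole signal; the point here is only that a known periodic structure turns a hard filtering problem into a much easier one.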

Carnegie-Mellon University
United States