The goal of this Faculty Early Career Development (CAREER) grant is to enable legged robots that are as sure-footed and agile on rough terrain as snow leopards, mountain goats, and human parkour experts. The project considers two critical questions toward achieving this goal. The first is how the robot can use cameras to decide where and how to take its next few steps. The second is how to ensure that upsets caused by small stumbles and slips fade away over time instead of growing larger and eventually causing a fall. The first question will be answered in part by observing how vision and locomotion are connected in humans and translating that relationship to legged robots. The second question requires extending current mathematical ideas about running and walking. For example, a legged robot might run along a dry path, walk slowly to cross a muddy patch, and then carefully hop across stepping-stones to cross a stream. Current approaches assume that each basic step is repeated forever; in contrast, this project allows different basic patterns of steps to be combined in any order. Without the ability to navigate realistic terrain and respond to unplanned obstacles, legged robots will remain restricted to highly controlled settings. This project will help bring these robots into real-world use for tasks such as exploration and security, moving goods in factories and warehouses, and assisting humans in everyday tasks. The education portion of this project will teach expert skills in dynamics and control through competitive student activities under creativity-enhancing constraints.
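
To make the "stumbles fade away" idea concrete, here is a toy numerical sketch (all matrices are invented for illustration; this is not the project's model). Each gait primitive is represented by a linearized step-to-step map that carries a small deviation from the nominal motion across one step. Because every map below is a contraction, the deviation shrinks no matter the order in which the gaits are composed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linearized step-to-step maps for three gait primitives:
# a small deviation x from the nominal motion is carried across one step
# as x_next = A @ x.  The numbers are invented for this sketch.
primitives = {
    "run":  np.array([[0.6, 0.2], [0.0, 0.7]]),
    "walk": np.array([[0.8, 0.0], [0.1, 0.5]]),
    "hop":  np.array([[0.5, 0.3], [0.2, 0.6]]),
}

# Each map above has spectral norm < 1 (it is a contraction), so a
# disturbance shrinks regardless of the order in which gaits are chained.
x = np.array([1.0, -0.5])   # deviation injected by a slip or stumble
for k in range(30):
    gait = rng.choice(list(primitives))
    x = primitives[gait] @ x
    print(f"step {k:2d} ({gait:>4s}): |deviation| = {np.linalg.norm(x):.5f}")
```

In a real system these maps would come from linearizing the robot's step-to-step (Poincaré-style) dynamics about each gait; the sketch only illustrates the property being sought, namely that deviations decay under arbitrary orderings of steps.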

This project will create control techniques for highly dynamic robots that can run rapidly over extreme terrain and transition between walking, running, vaulting, jumping, and landing on discrete footholds. These movements require rapid switching between dynamical behaviors based on perceived information about the world. An extension of current theory is needed to analyze the resulting aperiodic locomotion: stability results for a single gait repeated forever do not carry over to arbitrary sequences of gaits. The project invokes fractal-based theory to define and prove formal stability guarantees for such sequences. The project also unites deep learning and formal control theory to model and emulate human visual-motor integration for translation to legged robots.
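
A standard two-matrix example from switched-systems theory (values invented here; this is not the project's fractal-based analysis) makes the gap concrete: each of two step-to-step maps can be stable when its gait is repeated forever, yet alternating between the two gaits amplifies disturbances.

```python
import numpy as np

# Two invented step-to-step maps, each stable on its own: both eigenvalues
# of each map are 0.9, so repeating either gait forever damps deviations.
A1 = np.array([[0.9, 0.0], [1.5, 0.9]])
A2 = np.array([[0.9, 1.5], [0.0, 0.9]])
print("eigenvalues of A1:", np.linalg.eigvals(A1))   # [0.9, 0.9]
print("eigenvalues of A2:", np.linalg.eigvals(A2))   # [0.9, 0.9]

# Alternating them is a different story: the map for one alternation
# cycle, A1 @ A2, has spectral radius about 3.69, so the alternating
# sequence amplifies deviations even though each gait alone is stable.
rho = max(abs(np.linalg.eigvals(A1 @ A2)))
print("spectral radius of one alternation cycle:", rho)

x = np.array([1e-3, 0.0])   # a tiny initial stumble
for k in range(20):
    x = (A1 if k % 2 else A2) @ x
print("|deviation| after 20 alternating steps:", np.linalg.norm(x))
```

Any formal guarantee for aperiodic locomotion therefore has to certify every admissible composition of gait primitives, not just each primitive in isolation, which is the kind of theoretical extension the project pursues.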

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Budget Start: 2020-03-01
Budget End: 2025-02-28
Fiscal Year: 2019
Total Cost: $586,096
Institution: University of California Berkeley
City: Berkeley
State: CA
Country: United States
Zip Code: 94710