This research aims to create large-scale teams of human and robot peers that operate side by side in the same physical space, with each human and robot performing physical actions according to their own skills and capabilities. The intent is to develop an interaction style that is not based on direct commands from humans to robots, but rather on the idea that robots can implicitly infer the intent of human teammates through passive observation and then take appropriate actions in the current context. In this interaction, humans perform tasks naturally, as they would when working with a human teammate, thus avoiding the cognitive overload that occurs when humans are required to explicitly supervise the actions of several robot team members. This research has the potential to revolutionize how humans and robots work together in applications such as search and rescue, firefighting, security, defense, light construction, manufacturing, home assistance, and healthcare.

This research focuses on two key challenges: (1) how robots can determine humans' current goals, intents, and activities from sensor observations alone, and (2) how robots can respond appropriately to help humans with the ongoing task, consistent with the inferred human intent. Input to the robot system is a set of learned models, along with color and depth sensing. Models are learned using novel features for human perception and representation, including Depth of Interest features, four-dimensional local spatio-temporal features, adaptive human-centered features, and simplex-based orientation descriptors. Learning techniques make use of novel maximum temporal certainty models for sequential activity recognition and conditional random fields for environmental monitoring. Robot activity selection is achieved via a novel risk-aware cognitive model. The outcome of this research will be new software methodologies enabling robot cognition, learning, sensing, perception, and action selection for peer-to-peer human-robot teaming.
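To make the observe-infer-act loop concrete, the minimal sketch below shows one plausible shape for such a pipeline: a belief over human activities is computed from observed perceptual features, and a helping action is chosen by trading expected utility against a risk penalty, loosely mirroring the risk-aware selection idea. All activity names, probability tables, and the scoring rule here are illustrative assumptions for exposition, not the project's actual models or software.

```python
# Hypothetical sketch of a passive-observation teaming loop (assumptions
# throughout): infer the human's likely activity from observed features,
# then pick a robot action with a risk-penalized utility score.

ACTIVITIES = ["carry_box", "open_door", "idle"]

# Toy "learned model": P(observed feature | activity). In the actual work,
# features would come from color/depth sensing (e.g., the spatio-temporal
# features named in the abstract); these numbers are made up.
OBS_LIKELIHOOD = {
    "carry_box": {"arms_raised": 0.70, "walking": 0.25, "still": 0.05},
    "open_door": {"arms_raised": 0.50, "walking": 0.10, "still": 0.40},
    "idle":      {"arms_raised": 0.05, "walking": 0.15, "still": 0.80},
}

# Candidate robot responses per inferred activity: (action, utility, risk).
ACTIONS = {
    "carry_box": ("grasp_other_end", 1.0, 0.3),
    "open_door": ("hold_door", 0.8, 0.1),
    "idle":      ("stand_by", 0.1, 0.0),
}

def infer_intent(observations):
    """Return a normalized belief over activities, scoring each activity by
    the likelihood of the observed features (naive i.i.d. assumption)."""
    scores = {}
    for activity in ACTIVITIES:
        p = 1.0
        for obs in observations:
            p *= OBS_LIKELIHOOD[activity].get(obs, 1e-6)
        scores[activity] = p
    total = sum(scores.values()) or 1.0
    return {a: s / total for a, s in scores.items()}

def select_action(belief, risk_weight=1.0):
    """Risk-aware choice: maximize expected utility minus a weighted risk
    penalty over the candidate responses."""
    best_action, best_score = "stand_by", float("-inf")
    for activity, prob in belief.items():
        action, utility, risk = ACTIONS[activity]
        score = prob * utility - risk_weight * risk
        if score > best_score:
            best_action, best_score = action, score
    return best_action

if __name__ == "__main__":
    observed = ["arms_raised", "walking", "arms_raised"]
    belief = infer_intent(observed)
    print("belief over activities:", belief)
    print("chosen helping action:", select_action(belief))
```

In a real system the naive likelihood scoring would be replaced by the sequential models the abstract names (e.g., maximum temporal certainty models or conditional random fields), and the hand-set risk weights would come from the learned cognitive model.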

Budget Start: 2014-08-01
Budget End: 2018-07-31
Fiscal Year: 2014
Total Cost: $521,309
Institution: University of Tennessee Knoxville
City: Knoxville
State: TN
Country: United States
Zip Code: 37916