In the last several years, robotics research has transitioned from focusing exclusively on building fully autonomous, fully capable robots to also building partially capable robots that collaborate with human partners, letting the robot do what robots do best and the human do what humans do best. This transition has been fueled by a renaissance of safe, interactive systems designed to enhance small- and medium-scale manufacturing, and it has been accompanied by a change in how we think robots should be trained. Learning mechanisms in which the robot operates in isolation, learning from passive observation of people performing tasks, are being replaced by mechanisms in which the robot learns through collaboration with a human partner as they accomplish tasks together.

This project will develop a robot architecture that allows new skills to be taught to a robot by an expert human instructor, allows the robot to then become a skilled collaborator that operates side-by-side with a human partner, and finally allows the robot to teach that learned skill to a novice human student. To achieve this goal, popular but opaque learning mechanisms will need to be abandoned in favor of novel representations that support rapid learning while remaining amenable to explanation during collaboration and teaching, combined with serious consideration of the mental state (the knowledge, goals, and intentions) of the human partner. A fundamental outcome of this work will be a unified representation linking the existing literature on learning from demonstration to collaborative scenarios and to scenarios in which the robot acts as an instructor. Project outcomes will thus have broad impact in application domains such as collaborative manufacturing, while also enhancing our substantial investment in education and training (especially research opportunities for graduate and undergraduate researchers) and enriching efforts to broaden participation in computing.

This effort will build upon research in three subfields and extend the state of the art to address deficiencies in each:

1 - Robot as Student. Building on work in Learning from Demonstration, the team will construct robots that learn task models from humans. To be useful to the other thrust areas, however, these models must not be opaque, as the models produced by many current learning techniques are. Instead, a transparent model will allow the robot to give and solicit feedback about its performance, to explain what it has learned, and to proactively ask questions that speed up learning.

2 - Robot as Collaborator. The relatively new field of Human-Robot Collaboration struggles with synchronizing task execution between human and robot partners. By linking models of learned task behavior with models of user intention and understanding, the team will construct systems that become proficient at negotiating task allocation, accommodating user preferences, and restoring or updating internal representations in the case of errors or changes of plan.

3 - Robot as Teacher. Fields such as Intelligent Tutoring Systems build models of user knowledge, typically via Bayesian knowledge tracing. These models, however, represent knowledge only as known, unknown, or forgotten, and only for factual knowledge. By linking with concrete representations of task and intent, the team will create robots that can detect gaps in, extend, or repair a student's mental model of real-world tasks (a minimal knowledge-tracing sketch follows this list).

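For concreteness, the following is a minimal sketch of the standard Bayesian knowledge tracing update that such student models typically rely on, and that this project proposes to move beyond. The class name, parameter values, and example observations are illustrative assumptions, not details taken from the project.

    # Minimal Bayesian knowledge tracing (BKT) update for a single skill.
    # All parameter values here are illustrative, not from the project.
    from dataclasses import dataclass

    @dataclass
    class SkillModel:
        p_known: float = 0.2   # P(L): current belief that the student knows the skill
        p_learn: float = 0.15  # P(T): chance of learning the skill during a practice step
        p_slip: float = 0.1    # P(S): chance of an error despite knowing the skill
        p_guess: float = 0.2   # P(G): chance of success without knowing the skill

        def update(self, correct: bool) -> float:
            """Revise P(L) after observing one step of the task performed by the student."""
            if correct:
                evidence = self.p_known * (1 - self.p_slip)
                posterior = evidence / (evidence + (1 - self.p_known) * self.p_guess)
            else:
                evidence = self.p_known * self.p_slip
                posterior = evidence / (evidence + (1 - self.p_known) * (1 - self.p_guess))
            # Account for learning that may occur during the practice opportunity itself.
            self.p_known = posterior + (1 - posterior) * self.p_learn
            return self.p_known

    # Example: a robot teacher observes a student fail, then succeed, on one task step.
    skill = SkillModel()
    print(skill.update(correct=False))  # belief dips below the prior, then is nudged up by the learn rate
    print(skill.update(correct=True))   # belief rises after a correct attempt

Note that this representation tracks only whether a skill is known; it says nothing about how the task is structured or what the student intends, which is the gap the task- and intent-linked models above are meant to fill.
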
A set of milestones across three years will culminate in a demonstration of a robot that can learn a new task, collaborate on that task, and then teach that task to others.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 1813651
Program Officer: Ephraim Glinert
Budget Start: 2018-08-15
Budget End: 2021-07-31
Fiscal Year: 2018
Total Cost: $500,000
Name: Yale University
City: New Haven
State: CT
Country: United States
Zip Code: 06520