The investigators of this project envision a world where robots surround us, in our homes, in our hospitals, and in our factories, helping people by delivering medicine, preparing food, and assembling objects. Achieving this vision requires robots to communicate with people about their needs and then plan their activities to help meet those needs. Previous research has addressed these two problems separately, leading to technical solutions that do not work reliably in real-world situations and to difficulties in human-robot communication. To solve these problems, we are developing the Physically-Grounded Language with Affordances (PGLA) framework and concentrating our research on two thrusts: 1) enabling a robot to observe a patient and then answer a nurse's questions about the patient's activity, and 2) enabling a robot to respond to natural language requests in a collaborative cooking task and in a manufacturing setting. We will release our data sets and code as open source, which will have impact in technical areas beyond robotics, such as computer vision and machine learning. The results of the proposed research will find direct applications in industries such as manufacturing and assistive robotics.

This project takes a probabilistic approach to jointly learning to recognize affordances in the environment and to predict the associated natural language requests and descriptions. Because the affordance map is grounded in perceptual data, our robots will learn to robustly manipulate objects in the physical world, respond to natural language commands, and describe their experiences in words. Our learning approach enables the robot to infer cross-modal knowledge from large data sets of people carrying out activities paired with natural language descriptions of those activities, leveraging the strength of each modality to inform the others. Our novel learning algorithms will integrate and learn from multi-domain databases such as the semantic web, visual scenes, and a new activity database paired with natural language descriptions.
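To give a concrete flavor of this kind of joint grounding, the following is a minimal sketch, not the project's actual PGLA model: it scores affordances jointly from a perceptual feature vector and a request word via a factored model p(a, w | x) ∝ p(a | x) p(w | a). The affordance labels, vocabulary, features, and probability tables are all hypothetical placeholders.

```python
# Minimal illustrative sketch of probabilistic language-affordance grounding.
# All names, features, and probabilities below are hypothetical, not from the project.
import numpy as np

AFFORDANCES = ["graspable", "pourable", "openable"]
VOCAB = ["pick", "pour", "open"]

# p(a | x): softmax over affordances given a perceptual feature vector x.
W = np.array([[1.2, -0.3],
              [-0.5, 1.0],
              [0.1, 0.4]])  # hypothetical affordance-by-feature weights

def p_affordance_given_features(x):
    scores = W @ x
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()

# p(w | a): hypothetical word-given-affordance table (each row sums to 1).
P_WORD_GIVEN_AFF = np.array([
    [0.8, 0.1, 0.1],   # graspable -> "pick" most likely
    [0.1, 0.8, 0.1],   # pourable  -> "pour"
    [0.1, 0.1, 0.8],   # openable  -> "open"
])

def ground_request(word, x):
    """Posterior over affordances given a request word and a percept x."""
    w_idx = VOCAB.index(word)
    joint = p_affordance_given_features(x) * P_WORD_GIVEN_AFF[:, w_idx]
    joint /= joint.sum()
    return dict(zip(AFFORDANCES, joint))

if __name__ == "__main__":
    x = np.array([0.9, 0.2])          # hypothetical percept that looks graspable
    print(ground_request("pick", x))  # posterior concentrates on "graspable"
```

In such a factored model, language and perception each constrain the inferred affordance, so an ambiguous percept can be disambiguated by the request and vice versa; this is the intuition behind leveraging each modality to inform the others.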

Project Start:
Project End:
Budget Start: 2014-08-01
Budget End: 2017-07-31
Support Year:
Fiscal Year: 2014
Total Cost: $347,622
Indirect Cost:
Name: Brown University
Department:
Type:
DUNS #:
City: Providence
State: RI
Country: United States
Zip Code: 02912