We are the victims of our own success. We can now deploy mobile robots in real-world environments and have them operate fully autonomously for extended periods of time. We no longer have to surround our robots with graduate-student wranglers to keep them functional and to keep the general public at a safe distance. These technical successes mean that members of the public must now interact directly with robots, without the aid of an interpreter. But the public is poorly equipped for such interactions, being unfamiliar with real robots and how they work. As a result, the interactions often go poorly: the robot is hindered in performing its task, and the human is left unhappy.

For people to be comfortable interacting with a robot, they must feel that they understand what it is thinking, what it is trying to do, and what actions it will take. Moreover, they must be able to deduce this information from observing the robot for only a short period of time, just as we do with other humans we encounter. The fundamental problem is that humans communicate a wealth of information through a non-verbal "vocabulary" in which body language (how we stand, how we hold our arms, and so on), eye contact, nods, and other subtle cues ostensibly inessential to the task at hand play significant roles. We do this naturally and without conscious effort. Taken in context, this information lets us infer another person's state of mind, goals, and intentions with surprising accuracy; this, in turn, allows us to predict how a given interaction will unfold and gives us some control over it. Because people take this ability for granted, they suffer when it is absent, as is currently often the case when interacting with a mobile robot.

The PI intends to address this deficiency in the current project. He argues that to make human-robot interactions as natural as possible, we must equip robots with our physical vocabulary and ensure that they use it appropriately, following social norms. To achieve this goal, the PI will turn to the performing arts, where actors are trained to express themselves physically. A good actor can convey a vast amount of information about a character's state of mind, goals, and intentions simply by walking across the stage in a particular way. The actions may be stylized, larger-than-life, or subtle, but they are intended to convey information about the character's internal mental state. The techniques that actors employ have been honed and refined over hundreds of years and tested for effectiveness on the general public. In this research, the PI will exploit these insights and skills to develop a physical vocabulary that can communicate beliefs, intentions, and goals to humans interacting with a robot, thereby enabling people to better predict the robot's actions. Finally, the PI will rigorously evaluate these actions to verify that they are actually useful.

Broader Impacts: Robots are becoming an ever larger part of our lives, and members of the public will have to deal with them sooner or later. An understanding of the physical aspects of these interactions will make the integration of robots into our everyday lives far less painful and distressing.

Project Report

The goal of this work is to understand how robots can use "body language" to work better with people. When we interact with other people, we use a variety of (often subtle) body language to signal our intentions. For example, when we meet someone in a narrow corridor, we often look at them to indicate that we are giving them the right of way, and then move to the right-hand side of the corridor. These sorts of actions, which are not strictly necessary to accomplish the task, and the social rules that they follow, make our interactions with other people more efficient and effective. The central question of this work is whether the same sorts of things will help human-robot interactions.

The main thing we learned from the work is that it is important for robots to treat people differently from the way they treat obstacles. A table does not care how close the robot comes to it, while a human has a very well-defined "comfort zone" and gets uneasy if the robot (or another human, for that matter) gets too close. Recognizing people, figuring out where they are going, and respecting this comfort zone will be very important if we are ever going to see robots and people work together in the real world (a small illustrative sketch of this idea follows the report). We also learned that the gestures people make when interacting with others can often mean different things when performed by a robot. When a human nods or makes eye contact, there is a lot of subtlety in that gesture; most robots are not capable of this level of subtle movement, so their gestures are more easily misunderstood.

One of the biggest practical outcomes of this work is a new set of software that lets robots navigate around the world and interact appropriately with people. It is now part of a very widely-used software suite for all sorts of robots, and there is a good chance that it will make its way into commercial products at some point in the future. The findings of this work, together with the new software we wrote, will help robots interact more efficiently and effectively with people in the real world. A number of graduate students and undergraduates from around the United States contributed to this work as part of their degree programs. All of the software written during this project has been made freely available to both the academic and industrial robotics communities, in the hope that it will prove useful and that others will build on the work we have started.
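The report does not include the navigation software itself, but the "comfort zone" idea can be illustrated with a small sketch. The code below is an illustrative assumption, not the project's actual implementation: it adds a person-centered Gaussian penalty to a grid-based navigation costmap, stretched in the direction the person is walking, so that a planner treats people differently from static obstacles such as tables. All names here (Person, person_cost, add_people_to_costmap) are hypothetical.

```python
import numpy as np
from dataclasses import dataclass


@dataclass
class Person:
    x: float        # position in metres (world frame)
    y: float
    heading: float  # direction of travel, in radians
    speed: float    # walking speed, m/s


def person_cost(px, py, person, sigma_side=0.45, sigma_front=0.9):
    """Extra navigation cost at point (px, py) due to a nearby person.

    Unlike a table, a person has a "comfort zone": the cost falls off as a
    Gaussian around them, stretched along their direction of travel so the
    robot also avoids cutting across their path.
    """
    dx, dy = px - person.x, py - person.y
    # Rotate the offset into the person's frame (ahead = along heading).
    c, s = np.cos(person.heading), np.sin(person.heading)
    ahead = c * dx + s * dy
    side = -s * dx + c * dy
    # Stretch the zone further in front of a moving person.
    sigma_ahead = sigma_front * (1.0 + person.speed) if ahead > 0 else sigma_side
    return np.exp(-0.5 * ((ahead / sigma_ahead) ** 2 + (side / sigma_side) ** 2))


def add_people_to_costmap(costmap, origin, resolution, people, weight=100.0):
    """Add comfort-zone costs for each detected person to a 2-D costmap.

    costmap    : 2-D numpy array of navigation costs (row = y, col = x)
    origin     : (x, y) world coordinates of cell [0, 0]
    resolution : metres per cell
    """
    rows, cols = costmap.shape
    ys = origin[1] + np.arange(rows) * resolution
    xs = origin[0] + np.arange(cols) * resolution
    for p in people:
        for i, y in enumerate(ys):
            for j, x in enumerate(xs):
                costmap[i, j] += weight * person_cost(x, y, p)
    return costmap


if __name__ == "__main__":
    grid = np.zeros((60, 60))                     # 3 m x 3 m map at 5 cm resolution
    walker = Person(x=1.5, y=1.5, heading=0.0, speed=0.8)
    grid = add_people_to_costmap(grid, origin=(0.0, 0.0), resolution=0.05,
                                 people=[walker])
    print("peak added cost near the person:", grid.max())
```

A planner that minimizes cost over such a map will swing wide around a person's comfort zone and avoid crossing just in front of them, while still passing close to inanimate obstacles when space is tight.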

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 1258213
Program Officer: Ephraim Glinert
Budget Start: 2012-06-01
Budget End: 2013-08-31
Fiscal Year: 2012
Total Cost: $97,098
Name: Oregon State University
City: Corvallis
State: OR
Country: United States
Zip Code: 97331