Robots designed for social interactions can enrich the quality of life for individuals and society in a myriad of ways, such as automating undesirable physical work, supporting activities of daily living, and facilitating social connections. However, seamless integration of robots into society depends on both people and robots understanding how to communicate naturally and effectively with each other. Humans need to understand the capabilities of a robot and develop trust in the robot. Robots need to understand the capabilities and interests of the human and adapt their behavior accordingly. This project will use virtual reality and augmented reality technologies to help people and robots better understand each other before they meet in person. During a Shared Virtual Teaching Experience (SVTE), human users will be able to interact with a virtual version of a robot using virtual reality and augmented reality technologies. The SVTE will help both the robot and the users learn how to communicate with each other, build trust and rapport by sharing information about themselves, and adapt based on what each learns from the other. While an SVTE is a general approach for improving human-robot interactions, this project will focus on the needs of older adults, who may be hesitant to use and trust new technologies, including robots. Research has shown that older adults can benefit from social interactions with robots, especially when the robots help to create social connections with other older adults. This project develops new approaches that leverage virtual reality and augmented reality to enable users and robots to work together. It will address societal needs of an aging population using socially assistive robots and advance the state of the art in human-robot interaction.
It will also involve K-12 students to stimulate interest in science and engineering while engaging the elderly community in the context of a real-world need for human-centered technology.

This project uses the immersive communication modalities of virtual reality (VR) and augmented reality (AR) to overcome the barriers to social communication that hinder seamless integration of co-robots into human everyday lives. Specifically, the project is developing an immersive mixed reality experience, the Shared Virtual Teaching Experience (SVTE), that allows human users to get accustomed to robots, and vice versa, before they interact in the real world. Both VR and AR are potentially useful for this purpose: VR provides a consistent graphical environment that makes communication clear for users, while AR situates the interaction with a simulated robot in a real, physical environment that still benefits from graphical enhancements. This project develops SVTE pre-exposures in both VR and AR formats to enable immersive user training on how to communicate with the robot and to understand its functional and affective limitations. Throughout SVTE interactions, the system will collect information about users that will allow the robot to adapt for personalized interactions in the physical world. To evaluate the SVTE, both during development and upon completion, this project will perform a series of user studies, first with university students and then with older adults in a senior living facility. In the SVTE evaluation with elderly users, this work will focus on the use of robots for assisting in and facilitating social engagement to prevent social isolation, which has been shown to increase morbidity and mortality. Overall, this project will include the development of a mixed reality open-source testbed capable of communicating with physical robots via ROS, along with virtual communication strategies for robots. Using data from those virtual modalities, this work will generate multimodal user models that are typically difficult to perceive and create within the physical world.
The models will, in turn, allow non-experts to understand how to effectively interact with physical robots. This research will provide a framework for evaluating multimodal user models for naturally communicating, adapting, and personalizing human-robot interactions in various real-world contexts.
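To make the testbed-to-user-model data flow above concrete, the sketch below mimics the publish/subscribe pattern that a ROS-connected SVTE testbed might use: interaction events observed in VR/AR (gaze, response timing) are streamed to a user-modeling component, which derives a simple personalization parameter for the physical robot. All names, topic strings, and the pacing heuristic are hypothetical illustrations, not the project's actual implementation.

```python
# Illustrative sketch (hypothetical names throughout): an in-process
# stand-in for a ROS-style topic bus feeding a multimodal user model.
from collections import defaultdict
from dataclasses import dataclass, field
from statistics import mean

class TopicBus:
    """Tiny publish/subscribe bus, analogous to ROS topics."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        # Deliver the message to every subscriber of this topic.
        for cb in self._subs[topic]:
            cb(msg)

@dataclass
class UserModel:
    """Aggregates multimodal signals observed during SVTE sessions."""
    gaze_durations: list = field(default_factory=list)
    response_latencies: list = field(default_factory=list)

    def on_event(self, msg):
        # Route each event type to its signal history.
        if msg["type"] == "gaze":
            self.gaze_durations.append(msg["seconds"])
        elif msg["type"] == "response":
            self.response_latencies.append(msg["seconds"])

    def preferred_pace(self):
        # Toy heuristic: slow the robot's speech for users who
        # respond slowly; a real model would be far richer.
        if not self.response_latencies:
            return "normal"
        return "slow" if mean(self.response_latencies) > 2.0 else "normal"

bus = TopicBus()
model = UserModel()
bus.subscribe("/svte/user_events", model.on_event)

# Simulated events from a VR pre-exposure session.
bus.publish("/svte/user_events", {"type": "gaze", "seconds": 1.2})
bus.publish("/svte/user_events", {"type": "response", "seconds": 2.8})
bus.publish("/svte/user_events", {"type": "response", "seconds": 3.1})

print(model.preferred_pace())  # mean latency 2.95 s -> "slow"
```

In a real deployment, `TopicBus` would be replaced by actual ROS publishers and subscribers so the same user model could drive adaptation on the physical robot.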

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Project Start:
Project End:
Budget Start: 2019-10-01
Budget End: 2022-09-30
Support Year:
Fiscal Year: 2019
Total Cost: $750,000
Indirect Cost:
Name: University of Southern California
Department:
Type:
DUNS #:
City: Los Angeles
State: CA
Country: United States
Zip Code: 90089