This National Robotics Initiative project will contribute new knowledge at the convergence of several fields: virtual reality, robotic control, sensory feedback, ergonomics, and human factors. A customizable cobot will be designed to increase the fidelity of virtual reality simulations by enabling a user to feel physical forces, movements, and constraints at both hands and both feet. By integrating this whole-body robotic system with a virtual reality platform, the PIs will advance the understanding of how robotics can be used to identify potential risks of human worker/robot collaborations and to train workers to reduce workplace risk exposures. The PIs will use modeling and simulation to evaluate and remove potential hazards to humans from collaborative robotic operations, and to test interactions between collaborative robots and humans in simulated test beds. The effectiveness of the system will be evaluated through analyses of perceived presence, behavior, and neurophysiological response. The project will advance national prosperity by benefiting industry sectors that are likely to deploy collaborative robots (e.g., agriculture, construction, and healthcare). It will benefit society by identifying, through simulation, safety hazards that may arise while new technology is still being designed, before it is deployed in the physical world. In addition, the PIs will engage a diverse pool of graduate and undergraduate students in the research. Project activities also include mentoring high school students with disabilities toward college and providing STEM training, through robotics summer camps, to first-generation elementary school students from rural areas.

This project will design, build, control, and assess the effectiveness of a body-scale physical-interaction simulation cobot in virtual reality that provides customizable force and position feedback at the hands and feet. The platform, named ForceBot, is a novel cobot designed to dramatically increase the fidelity of virtual reality simulations. The project will contribute significantly to the field of human-robot interaction by exploring how robot-based active haptic simulation can adapt to a variety of tasks, environments, and people with minimal modification to hardware and software. Project effort is organized into three objectives: developing techniques for haptic rendering of simulated interactions using a VR physics engine; implementing and controlling the ForceBot system; and evaluating the integrated system in a series of human-subject experiments. The project will advance knowledge of robot dynamics and of algorithms for personalized human motion recognition and prediction. ForceBot will enable workers to receive training on future work tasks, support identification of the potential risks collaborative robots pose to workers, and allow evaluation of different control strategies for wearable robots such as exoskeletons. If successful, this research will expand the potential of virtual reality for training physically intensive tasks in application domains including sports, gaming, emergency response, and industrial settings where workplace risk exposures must be reduced.
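
To illustrate the kind of haptic rendering described above, the sketch below shows a minimal admittance-control loop in which a contact force queried from a physics engine drives the commanded motion of one end effector (e.g., a hand attachment point). This is only a conceptual example under assumed parameters; the physics-engine query, virtual-wall model, and control gains are hypothetical placeholders, not the project's actual implementation.

    import numpy as np

    DT = 0.001            # control period (s), assumed
    M_VIRT = 5.0          # virtual mass (kg), assumed
    B_VIRT = 40.0         # virtual damping (N*s/m), assumed

    def query_contact_force(position):
        """Placeholder for a physics-engine query: return the contact force (N)
        acting on the avatar's hand, here modeled as a stiff virtual wall at x = 0.5 m."""
        wall_x = 0.5
        stiffness = 2000.0  # N/m, assumed
        depth = position[0] - wall_x
        fx = -stiffness * depth if depth > 0 else 0.0
        return np.array([fx, 0.0, 0.0])

    def admittance_step(pos, vel, f_user, f_env):
        """Integrate the virtual dynamics M*a + B*v = f_user + f_env and
        return the next commanded end-effector position and velocity."""
        acc = (f_user + f_env - B_VIRT * vel) / M_VIRT
        vel = vel + acc * DT
        pos = pos + vel * DT
        return pos, vel

    # Toy loop: the user pushes with 10 N in +x; motion stops where the
    # rendered wall force balances the user's push.
    pos = np.zeros(3)
    vel = np.zeros(3)
    f_user = np.array([10.0, 0.0, 0.0])   # measured interaction force (stub)
    for _ in range(3000):
        f_env = query_contact_force(pos)
        pos, vel = admittance_step(pos, vel, f_user, f_env)
    print("final x position:", round(pos[0], 3))

In a real system, the commanded position would be sent to the robot's motion controller each cycle and the user's interaction force would come from force/torque sensing at the attachment point; the structure of the loop, however, stays the same.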

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Budget Start: 2020-10-01
Budget End: 2024-09-30
Fiscal Year: 2020
Total Cost: $312,985
Name: University of Florida
City: Gainesville
State: FL
Country: United States
Zip Code: 32611