The recent pandemic outbreaks, including Ebola, Zika, and the 2019 Novel Coronavirus (2019-nCoV), call for telemedicine to go beyond mere telepresence, toward robots that perform real-world nursing assistance tasks requiring the coordinated control of manipulation, locomotion, and active teleoperation. Remotely controlled nursing robots offer a promising alternative for quarantine and remote patient care. However, traditional and contemporary human-robot interfaces fundamentally limit the performance and user experience of nursing robot teleoperation, and may reinforce the burden and safety concerns that discourage healthcare workers from adopting robots. To address this problem, this project will (1) develop an innovative, integrated teleoperation interface that is transparent and intuitive, to support freeform and coordinated motion control of remote nursing robots, and (2) combine this interface with robot intelligence to enable nursing professionals to learn robot teleoperation with minimal training, and to reduce physical and cognitive workload through shared autonomy. The proposed project will promote the progress of science in human-robot interfaces for robot teleoperation, and advance the quality, availability, and sustainability of healthcare in present and future pandemic crises. This project will have significant impacts on the domain of nursing, which comprises 2.9 million registered nurses and 160,000 nurse practitioners across the U.S. It will revolutionize patient care in quarantine, and has the potential to extend to in-home care, clinics, and hospitals given the projected shortage of the nursing workforce. The fundamental research also generalizes to other worker domains that employ robot teleoperation, including warehousing, social service, and maintenance. The proposed research will forge substantial collaboration among faculty and students in robotics engineering, nursing, and social science.
This project consists of two research themes. Research Theme 1 will develop a soft-robot teleoperation interface architecture and systematic, human-inspired motion mapping strategies, to support the intuitive and transparent mapping of motion, force, and perception information between humans and robots. The proposed interface will enable transparent and legible robot behavior for reaching-to-grasp, loco-manipulation, and the control of active telepresence. Research Theme 2 will develop the intelligence of the interface, to enable interactive learning and mutual adaptation between humans and robots. Based on game-theoretic planning, it will develop adaptive shared-autonomy strategies that use human-robot communication via haptic feedback. It will employ active telepresence to enhance operator training and reduce workload during teleoperation. The integrated interface will be evaluated in comprehensive user studies with registered nurses, nursing faculty, and nursing students. The evaluation will assess performance and user experience, including human-robot teaming, using metrics of efficiency, workload, and interface effort. It will also evaluate the social impacts of the proposed human-robot interface on the acceptance and adoption of nursing robots by the current and future nursing workforce.
This proposal was jointly funded with the National Institute for Occupational Safety and Health (NIOSH) in the Centers for Disease Control and Prevention (CDC).
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.