The objective of this research is to answer fundamental design questions for multi-functional robotic skin sensors, optimize their placement on assistive robotic devices, have the robot and human "learn" how to use the skin sensors efficiently, and quantitatively assess the impact of this assistive technology on humans. The approach is to design and fabricate integrated micro-scale sensors in conjunction with iterative simulation and experimental studies of the performance of physical human-robot interaction enabled by this technology.

Intellectual Merit
This project will contribute efficient algorithms for optimal placement and data networking of distributed skin sensors on robots; new learning and control algorithms to sense human intent and improve interactivity; practical robotic skin and garment hardware with distributed sensors, including tactile, thermal imaging, and acceleration sensing, in flexible materials that can be easily attached to and peeled off robots; and new metrics to evaluate the impact of this skin on humans, including level of assistance, safety, ease of use, aesthetics, and therapeutic benefits.

Broader Impacts
Co-robots of the future will share their living spaces with humans and, like people, will wear sensor skins and clothing that must be interconnected, fitted, cleaned, repaired, and replaced. In addition to aesthetic purposes that increase societal acceptance, these sensorized garments will also enhance robot perception of the environment and enable extraordinary levels of safety, cooperation, and therapy for humans. The research proposed here will unlock near-term as well as unforeseen applications of robotic skin with broad applicability, especially to home assistance, medical rehabilitation, and prosthetics.

Project Start:
Project End:
Budget Start: 2012-10-01
Budget End: 2017-05-31
Support Year:
Fiscal Year: 2012
Total Cost: $1,349,766
Indirect Cost:
Name: University of Texas at Arlington
Department:
Type:
DUNS #:
City: Arlington
State: TX
Country: United States
Zip Code: 76019