Making contact with a person's body is critical to many of the most important caregiving tasks for people with physical disabilities. During these tasks, the forces applied by the robot to the body of the human client (care recipient) are of central importance, yet robots currently lack knowledge of what forces are appropriate for common tasks and for contact with different locations on the client's body. In this project, the PI's goal is to endow assistive robots with the ability to use appropriate forces when haptically interacting with people. To this end, he will capture and statistically model the forces applied when a person performs assistive tasks for themselves or provides care to another person; enable robots to intelligently regulate the forces they apply when performing assistive tasks, so that the applied forces are comparable to those used during human-human interactions; and enable clients to effectively control the forces applied by a robot during assistive tasks.

Throughout the research, the PI will conduct experiments to test three hypotheses: (1) that the type of task and the pose of the tool relative to the client's body are highly predictive of the force applied by a human caregiver; (2) that when performing tasks on a mannequin, the robot will successfully emulate the forces observed during human-human interaction; and (3) that when the robot applies force to the client's body, the client will prefer that the robot interpret user commands using knowledge of the task and the pose of the tool rather than a constant mapping.

Because a person's ability to perform activities of daily living (ADLs) is highly predictive of his or her ability to live independently, the work will focus on four representative ADL tasks that require contact with the client's head: feeding a person yogurt, wiping a person's face, brushing a person's hair, and shaving a person with an electric razor. Project outcomes will include a system that enables a PR2 robot from Willow Garage to assist people with severe physical disabilities with these four tasks; the PR2 will be equipped with force-torque sensors at its wrists, specialized tools, and a Kinect 3D sensor on its head.
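To make the approach concrete, the sketch below (hypothetical Python, not from the project) illustrates one plausible reading of the two technical components: a statistical model mapping task type to an appropriate contact force, here reduced to a lookup table of per-task force means and standard deviations, and a simple admittance-style control step that moves the tool along the contact normal so that the force measured at the robot's wrist tracks the target. All names, task labels, and gain values are illustrative assumptions; the project's actual models would be learned from captured human-human data and conditioned on tool pose as well as task.

```python
import numpy as np

# Hypothetical per-task force statistics (mean, std in newtons), standing in
# for a model estimated from captured human-human caregiving trials. The
# project's model would also condition on the tool's pose relative to the body.
FORCE_STATS = {
    "feed_yogurt": (0.5, 0.2),
    "wipe_face":   (1.5, 0.5),
    "brush_hair":  (2.0, 0.7),
    "shave":       (1.0, 0.3),
}

def target_force(task: str, client_scale: float = 1.0) -> float:
    """Task-appropriate contact force (N). `client_scale` lets the client
    modulate the force around the human-human mean, rather than commanding
    raw force through a constant mapping."""
    mean, _std = FORCE_STATS[task]
    return mean * client_scale

def admittance_step(tool_pos: np.ndarray, contact_normal: np.ndarray,
                    measured_force: float, task: str,
                    gain: float = 0.002, dt: float = 0.01) -> np.ndarray:
    """One control step: displace the tool along the contact normal so the
    force measured by the wrist force-torque sensor tracks the target.
    `gain` has units of m/(N*s); the value here is illustrative."""
    force_error = target_force(task) - measured_force   # N
    velocity = gain * force_error * contact_normal      # m/s, along the normal
    return tool_pos + velocity * dt                     # updated position (m)

# Example: the tool is pressing too lightly while wiping, so it advances slightly.
new_pos = admittance_step(np.array([0.4, 0.0, 1.2]), np.array([1.0, 0.0, 0.0]),
                          measured_force=0.8, task="wipe_face")
```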

Broader Impacts: This research will begin to endow robots with a crucial form of "common sense" while quantitatively analyzing and synthesizing haptic interaction in the context of humans' most basic needs. It will also lead to a better understanding of human-robot interaction when the robot initiates contact with the user, and will contribute to data-driven methods for intelligent control. The PI will publish extensively and release open-source code, so that the work can catalyze progress toward robots that could empower millions of people to live more independently with a higher quality of life. The PI will directly involve people with disabilities in the research, and will actively engage the broader community by participating in events such as the RESNA conferences and the Atlanta Abilities Expo. In addition, he will incorporate research results into his biomechanics class and graduate course on haptics, and will adapt the material, using the methods and tools of Khan Academy (www.khanacademy.org/), to teach people around the world about robotics and these topics.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 1150157
Program Officer: Ephraim Glinert
Budget Start: 2012-03-01
Budget End: 2018-02-28
Fiscal Year: 2011
Total Cost: $499,996
Name: Georgia Tech Research Corporation
City: Atlanta
State: GA
Country: United States
Zip Code: 30332