Computers have progressed from their origins as isolated rooms of electronic components to being distributed, highly connected, mobile personal devices that are increasingly intertwined with everyday human experience. But computers haven't yet permeated the domain that is most intuitive and essential for their users, namely that of three-dimensional space and naturalistic human movement. In this research the PI will test the hypothesis that graded whole-body tactile feedback can help humans learn or relearn important body postures and motions. To this end, she will augment commercial human motion tracking with a suit of modular tactile actuators (tactors) that provide spatially-registered naturalistic real-time feedback on the way in which each limb segment should be moved, emulating the light touch of a physical therapist, teacher, or coach. Through collaboration with a clinical researcher the PI will focus in this project on rehabilitation for apraxic stroke patients. Our current understanding of stroke indicates that these patients cannot accurately estimate the pose of their limbs when performing purposeful movements, so the PI will augment their motion practice with continuous tactile guidance about the 3D location and magnitude of any configuration errors.

Determining the efficacy of this approach will advance our knowledge of healthy vs. impaired human motor control, and will also improve our understanding of the way in which humans process certain types of tactile signals. Development of the novel modular tactor system will provide insights on the effectiveness of voice-coil tactors and the range of sensations they can create. In the course of testing the project's primary hypothesis, the PI will employ human-subject experiments to determine which system design methods best succeed at helping stroke patients recover.

The project will be organized into low- and high-level thrusts, each of which will be spearheaded by a doctoral student. The PI's prior work on haptic contact feedback will help her successfully lead this project, as will the support and resources of relevant experts at the University of Pennsylvania, at nearby Moss Rehabilitation Research Institute, and at Engineering Acoustics, Inc., a leading tactor company.

Broader Impacts: This project will have immediate relevance to stroke rehabilitation, with excellent potential for positive impact on society in the longer term through application to a variety of topics in human-centered computing, especially computer-mediated human motion guidance scenarios such as athletic training and haptic virtual environments. The PI will strive to conduct this research in a way that enhances its appeal to students from groups that are typically underrepresented in computer science and engineering, especially women.

Project Report

This research project investigated a new approach for helping people learn how to move their bodies in specific ways, such as for sports training or stroke rehabilitation. People typically learn such movements by practicing them over and over under the guidance of a coach or a therapist. We were interested in discovering whether computer technology could make this learning process more efficient by augmenting the guidance a human can provide. We developed a series of systems that use tactile cues to guide a person's movements along a desired trajectory over time. We focused on motions of the arm, but the technology could also work for other body parts. Each system uses sensors to measure the person's movement and a computer program to calculate how far the person is from the desired motion. If the user strays from the desired motion, the system uses one or more wearable tactile actuators to tell the person which way to move to correct the error. Because they are inexpensive and widely available, most of the tactile actuators that we studied were eccentric rotating-mass motors that generate vibrations; cell phones use this same type of motor to deliver silent buzzing alerts. We deliberately chose technology that is inexpensive and widely available so that our approach could be easily adopted by other researchers and therapists if it proved successful.

The main tactile motion guidance system that we created is shown in the first attached image; it is the third version of this system built during the project. It uses a Microsoft Kinect (for the Xbox 360) to track the user's body movements, particularly the motion of one arm. A computer screen shows the movement the user is making, as well as a wireframe arm performing a pre-recorded movement that the user is supposed to follow. When the user's upper arm and/or forearm is outside the wireframe, the computer activates the vibrotactile actuator closest to the part of the arm that is outside the wireframe; the actuators are housed in the two black fabric bands worn by the user. The vibration tells the user to move away from that location, so that the arm returns toward the desired trajectory. Twenty-six healthy adults tested this system both with and without the tactile feedback. We found that the tactile feedback helped subjects perform simple movements, such as flexing and extending the elbow, but it did not help with more complicated movements, like throwing a ball.

At the end of the project, we created a new and improved version of our full-arm motion guidance system, designed to address some of the shortcomings we noticed when testing the system described above. The new version consists of a set of wearable electronic modules, each about 3 cm by 3 cm by 1 cm. Some of the modules track the motion of the user's arm using inertial sensors, and the rest deliver vibrotactile cues directly on the skin of the user's arm. We have not yet tested this new version with human subjects; when we do, we hope to discover whether the technical improvements help users learn movements more quickly and accurately.
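To make this guidance loop concrete, the sketch below shows one way the per-frame computation could work. It is a minimal illustration rather than the project's actual code: the function and variable names, the 5 cm tolerance, and the assumption that the tracker supplies 3D joint positions as NumPy arrays are all ours.

```python
import numpy as np

# Hypothetical tolerance: how far (in meters) a joint may stray from the
# reference pose before a cue is triggered. The thresholds actually used
# in the project are not specified here.
ERROR_TOLERANCE_M = 0.05

def guidance_step(tracked_joints, reference_joints, actuator_positions):
    """One iteration of a vibrotactile motion-guidance loop.

    tracked_joints / reference_joints: dicts mapping joint names
    (e.g. 'elbow', 'wrist') to 3D positions (NumPy arrays, meters).
    actuator_positions: dict mapping actuator IDs to their 3D positions
    on the arm, in the same coordinate frame as the tracked joints.
    Returns the ID of the actuator to vibrate, or None if on track.
    """
    # Find the joint that deviates most from the reference pose.
    worst_joint, worst_error = None, 0.0
    for name, ref_pos in reference_joints.items():
        error = np.linalg.norm(tracked_joints[name] - ref_pos)
        if error > worst_error:
            worst_joint, worst_error = name, error

    if worst_error <= ERROR_TOLERANCE_M:
        return None  # arm is close enough to the reference: no cue

    # The cue is repulsive: vibrate the actuator closest to the erring
    # part of the arm so the user moves away from the vibration, back
    # toward the desired trajectory.
    erring_pos = tracked_joints[worst_joint]
    return min(actuator_positions,
               key=lambda a: np.linalg.norm(actuator_positions[a] - erring_pos))
```

In the running systems, the reference pose advances frame by frame along the pre-recorded trajectory, so calling a routine like this once per tracker frame yields, at each instant, the single actuator (if any) that should vibrate.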
The other system that we created was a tactile guidance system for wrist rotation; we focused on this joint because we found it particularly hard to track and guide using our other approaches. This system explored alternative tactile actuators. In addition to vibration, we created wearable tactile actuators that tap on, drag across, squeeze, and twist the user's wrist, as shown in the second attached image. These devices were made from low-cost servo motors and 3D-printed plastic parts. Ten human subjects tested the five devices, each with two control algorithms. The best overall performance came from the actuator that repeatedly taps on the side of the user's wrist to show which way to move; a sketch of this cue appears at the end of this report.

The intellectual merit of this project lies in creating and testing a suite of new approaches to guiding human movement. We combined modern motion sensors and tactile actuators in novel ways to create systems that allow a human to interact with a computer in the very natural domain of 3D movement. We rigorously evaluated our solutions in human-subject experiments and published the results at academic conferences and in academic journals so that other researchers can learn from our findings.

The broader impacts of this project are twofold. First, the technology that we studied has the potential to improve movement training for a wide variety of people, from athletes to stroke patients. Second, the students who worked on this project strengthened their research and engineering skills.
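As a concrete illustration of the wrist-tapping cue described above, here is a minimal sketch of one plausible control loop. It is hypothetical: the servo interface (an object with a set_angle(degrees) method), the deadband, the rest and press angles, and the rate-coding of error magnitude are illustrative assumptions, not the parameters or algorithms actually tested.

```python
import time

DEADBAND_DEG = 5.0     # no cue when the wrist is close enough (assumed)
SERVO_REST_DEG = 90.0  # tapper lifted off the skin (assumed)
TAP_PRESS_DEG = 115.0  # servo angle that presses the tapper down (assumed)

def tap_cue(servo_cw, servo_ccw, rotation_error_deg):
    """Tap once on the side of the wrist that signals the needed rotation.

    servo_cw / servo_ccw: hypothetical servo drivers for the tappers on
    the two sides of the wrist, each exposing set_angle(degrees).
    rotation_error_deg: signed pronation/supination error from the
    tracking code (positive = rotate one way, negative = the other).
    """
    if abs(rotation_error_deg) < DEADBAND_DEG:
        servo_cw.set_angle(SERVO_REST_DEG)
        servo_ccw.set_angle(SERVO_REST_DEG)
        return

    # Pick the tapper on the side of the needed rotation (an attractive
    # "move toward the tap" mapping is assumed here).
    servo = servo_cw if rotation_error_deg > 0 else servo_ccw

    # Rate-code the error: larger errors produce faster tapping,
    # bounded so the servo has time to complete each stroke.
    pause_s = max(0.1, 5.0 / abs(rotation_error_deg))
    servo.set_angle(TAP_PRESS_DEG)   # press the tapper onto the skin
    time.sleep(0.05)
    servo.set_angle(SERVO_REST_DEG)  # lift it back off
    time.sleep(pause_s)
```

Calling a routine like this repeatedly while the user rotates the wrist produces a train of taps whose side indicates the direction to move and whose tempo reflects how far the wrist is from the target.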

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Application #: 0915560
Program Officer: Ephraim P. Glinert
Budget Start: 2009-07-01
Budget End: 2014-06-30
Fiscal Year: 2009
Total Cost: $532,000
Name: University of Pennsylvania
City: Philadelphia
State: PA
Country: United States
Zip Code: 19104