Through our hands, we communicate, care for ourselves and others, and use tools to affect our world. However, our understanding of how we control our hands to accomplish such feats is still in its infancy. Dexterous manipulation is especially challenging to study, as even "simple" manipulation tasks are complex in their details. The task of lifting a wrench into the hand, for example, can be decomposed into five separate actions, each of which may require a specialized control strategy. It is important to study human examples of such activities; through understanding human expertise we can reach practical outcomes such as physically intelligent animated characters for training and remote communication, robots capable of unprecedented dexterity, and prosthetic designs that far exceed the current state of the art in their elegance and functionality.

The key hurdle to significant advances in these areas is the difficulty of capturing human manipulation in a form that facilitates analysis and study. The rapid sequence of contact events, the hand's large number of degrees of freedom, and the close contact between hand and object make capture with traditional methods effectively impossible. The investigators are developing an alternative: simulation motion capture, in which a user interacts with and guides a running simulation. Through this approach, details such as contact timing, contact area, and contact forces become available for the first time for general manipulation tasks. Key innovations include a fast simulation system for a deformable human hand (or full body) and novel techniques for controlling such a high-degree-of-freedom simulation with intent and precision. In parallel, the investigators are creating a database of manipulation tasks, studying new languages for action segmentation and control-law development, developing robust autonomous controllers for grasping and manipulation, and investigating novel classifiers for recognizing affordances.
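To make the approach concrete, the following minimal sketch shows what a simulation-motion-capture loop might look like: the user's guidance steers a running physics simulation while contact timing, area, and force are logged at every step. The code is illustrative Python only; the simulator interface (apply_user_guidance, step, get_contacts) is an assumption, not the investigators' actual system.

    from dataclasses import dataclass, field

    @dataclass
    class ContactEvent:
        time: float    # simulation time of the contact (seconds)
        link: str      # hand link in contact, e.g. "index_distal"
        area: float    # estimated contact patch area (m^2)
        force: tuple   # contact force vector (N)

    @dataclass
    class CaptureLog:
        events: list = field(default_factory=list)

    def capture_session(sim, user_input, duration, dt=1.0 / 240.0):
        """Step the simulation in real time while the user guides it,
        recording contact timing, area, and force at every step."""
        log = CaptureLog()
        t = 0.0
        while t < duration:
            sim.apply_user_guidance(user_input.poll())  # user steers the sim
            sim.step(dt)                                # advance the physics
            for c in sim.get_contacts():                # contacts this step
                log.events.append(ContactEvent(t, c.link, c.area, c.force))
            t += dt
        return log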

Project Report

The goal of this project was to investigate new strategies for capturing the motion of the human hand during grasping and manipulation. Motions in which the hands interact with objects and the environment are traditionally very difficult to capture due to the complexity of the human hand, the complexity of the motions involved, and the fact that parts of the hand and points of interaction are often occluded from view and otherwise difficult to measure. Yet capturing and understanding grasping and manipulation is essential for creating animated characters that interact with objects convincingly, as needed for applications such as training and educational virtual environments.

Our solution to this problem was to develop a fast simulation system for capturing physics-based motions and interactions in real time. Capturing motions in simulation offers two key advantages: the ability to go back and restart from any point in order to improve results, and the availability of detailed information about contact points and contact forces throughout task execution.

Three primary project outcomes are highlighted here. First, we developed a prototype system for capturing hand motions, supported by a novel real-time simulation system; snapshots from the system appear in Figure 1. Second, we created a novel interface for driving the simulation (or a real robot) from a touchscreen device such as the iPad (Figure 2). We find that a touchscreen can be a surprisingly effective capture device: tracking the fingertips is a powerful mechanism for controlling tabletop manipulations, overhead grasps, and other interactions where the active fingers are nearly planar. Third, we found that understanding the physics of object behavior is critical for accurately predicting the success of actions such as grasping, a finding that matters for any application where real-world actions are planned in advance.

The ability to capture grasping and manipulation motions in a fast, inexpensive, and reliable manner has implications beyond character animation. For example, our findings can inform the control and design of dexterous robots and improve our understanding of human dexterity. Our results have already been applied to controlling robots with dexterous hands (Figure 2) and to the development of successful planners for robotic grasping (Figure 3).
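The first advantage noted above, restarting a capture from any point, can be pictured as a checkpointing wrapper around the simulation. This is a hypothetical sketch in the same illustrative Python; save_state, restore_state, and time are assumed simulator methods.

    class RewindableCapture:
        """Periodically checkpoint simulation state so the user can
        rewind a capture session to any earlier point and retry."""

        def __init__(self, sim, checkpoint_every=0.25):
            self.sim = sim
            self.checkpoint_every = checkpoint_every  # seconds between checkpoints
            self.checkpoints = []                     # (time, saved_state) pairs
            self.next_checkpoint = 0.0

        def step(self, dt):
            if self.sim.time() >= self.next_checkpoint:
                self.checkpoints.append((self.sim.time(), self.sim.save_state()))
                self.next_checkpoint += self.checkpoint_every
            self.sim.step(dt)

        def rewind_to(self, t):
            # Restore the newest checkpoint at or before time t; discard later ones.
            while self.checkpoints and self.checkpoints[-1][0] > t:
                self.checkpoints.pop()
            if self.checkpoints:
                time, state = self.checkpoints[-1]
                self.sim.restore_state(state)
                self.next_checkpoint = time + self.checkpoint_every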
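The touchscreen interface (the second outcome) can be understood as a mapping from 2-D touch points to 3-D fingertip goals that an inverse-kinematics solver then tracks; this works because the active fingers are nearly planar in tabletop tasks. The sketch below assumes a calibrated homography from screen pixels to the table plane and a generic ik_solve routine; neither is the project's actual code.

    import numpy as np

    def touches_to_targets(touch_points_px, screen_to_table, table_height=0.0):
        """Convert pixel-space touch points into 3-D fingertip goals
        on the tabletop plane.

        touch_points_px : (n, 2) array of touch positions in pixels
        screen_to_table : (3, 3) homography from pixels to table (x, y) in meters
        """
        pts = np.asarray(touch_points_px, dtype=float)
        homog = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
        xy = (screen_to_table @ homog.T).T
        xy = xy[:, :2] / xy[:, 2:3]                       # dehomogenize
        z = np.full((len(pts), 1), table_height)          # fingers stay on the plane
        return np.hstack([xy, z])                         # (n, 3) goal positions

    def drive_hand(sim_hand, targets, ik_solve):
        """Track each touch with the corresponding simulated fingertip."""
        for finger, goal in zip(sim_hand.active_fingers, targets):
            q = ik_solve(sim_hand, finger, goal)    # joint angles reaching the goal
            sim_hand.set_finger_targets(finger, q)  # servo toward the IK solution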
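The third finding, that physical reasoning is needed to predict grasp outcomes, suggests testing a candidate grasp by simulating it forward and checking whether the object stays with the hand during a lift. The following sketch makes that idea concrete; every simulator call shown is an assumption, not the project's implementation.

    def predict_grasp_success(sim, grasp, settle_time=0.5,
                              lift_height=0.10, slip_tolerance=0.02):
        """Execute a candidate grasp in simulation and report whether
        the object follows the hand through a lift."""
        state = sim.save_state()          # checkpoint so the test has no side effects
        sim.execute_grasp(grasp)
        sim.advance(settle_time)          # let contacts settle

        object_z_before = sim.object_position()[2]
        sim.lift_hand(lift_height)
        sim.advance(settle_time)

        # Success if the object rose with the hand instead of slipping out.
        object_rise = sim.object_position()[2] - object_z_before
        success = object_rise > lift_height - slip_tolerance

        sim.restore_state(state)          # rewind after the virtual trial
        return success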

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 1145640
Program Officer: Lawrence Rosenblum
Budget Start: 2011-08-01
Budget End: 2013-07-31
Fiscal Year: 2011
Total Cost: $81,000
Name: Carnegie-Mellon University
City: Pittsburgh
State: PA
Country: United States
Zip Code: 15213