This award is funded under the American Recovery and Reinvestment Act of 2009 (Public Law 111-5).

The motivating problem of this research is to determine how to build computational models of expressive human movement for use in character animation applications. Satisfactory solutions to this problem must allow a high degree of control so that character movement can be customized for any context. This work will unify traditionally separate knowledge-driven and data-driven approaches to character animation, building on the control inherent in knowledge-driven techniques and the realism of motion capture data. A feature-based approach will be used to develop generative models. In this approach, a key-feature set will be determined in consultation with movement professionals, and professional performers working in a motion capture studio will provide data sampling the range of these features. Computational models of each feature will be developed from this data using a combination of procedural and learning techniques. The end goal is style definition, in which explicit aspects of movement style can be represented computationally. This will support both movement analysis and movement generation through a software framework that allows each of these features to be combined and expressed. Key applications include models for conversational agents and a range of animation tools.
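
As one concrete reading of this feature-based pipeline, the sketch below pairs an analysis function and a generation function for a single feature and then combines several named feature targets into an explicit style definition. It is a minimal sketch assuming Python with NumPy; the class names (MovementFeature, StyleDefinition), the example "extent" feature, its 1.0 m normalizer, and the sequential blending rule are illustrative assumptions, not the project's actual feature set or framework.

```python
# Illustrative sketch only: hypothetical names and a toy feature, not the
# project's actual design.
from dataclasses import dataclass
from typing import Callable, Dict

import numpy as np

# A motion clip represented as a (frames, joints, 3) array of joint positions.
Motion = np.ndarray


@dataclass
class MovementFeature:
    """One expressive feature with an analysis side and a generation side."""
    name: str
    analyze: Callable[[Motion], float]        # motion -> feature value in [0, 1]
    apply: Callable[[Motion, float], Motion]  # (motion, target value) -> styled motion


def extent_analyze(motion: Motion) -> float:
    """Spatial extent: mean joint distance from the per-frame body centroid."""
    centroid = motion.mean(axis=1, keepdims=True)
    spread = np.linalg.norm(motion - centroid, axis=2).mean()
    return float(np.clip(spread / 1.0, 0.0, 1.0))  # 1.0 m is an assumed normalizer


def extent_apply(motion: Motion, target: float) -> Motion:
    """Scale joint offsets about the centroid toward the target extent."""
    centroid = motion.mean(axis=1, keepdims=True)
    current = max(extent_analyze(motion), 1e-6)
    return centroid + (motion - centroid) * (target / current)


class StyleDefinition:
    """An explicit, editable style: named feature targets applied in sequence."""

    def __init__(self, targets: Dict[str, float]):
        self.targets = targets

    def apply(self, motion: Motion, features: Dict[str, MovementFeature]) -> Motion:
        styled = motion
        for name, value in self.targets.items():
            styled = features[name].apply(styled, value)
        return styled


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clip = rng.normal(scale=0.3, size=(120, 20, 3))  # stand-in for a mocap clip
    features = {"extent": MovementFeature("extent", extent_analyze, extent_apply)}
    style = StyleDefinition({"extent": 0.8})  # e.g., an "expansive" performance
    styled = style.apply(clip, features)
    print("extent before:", extent_analyze(clip), "after:", extent_analyze(styled))
```

In this reading, each feature's analysis function supports movement analysis of captured data, its generation function supports synthesis, and a style is simply a set of explicit, editable feature targets, which is one way the abstract's goals of fine-grained control and style definition could be realized in software.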

This work benefits society through the development of new computational models of expressive movement and by providing deeper insights into the nature of human motion. Computational models of movement that offer meaningful, fine-grained control are essential for a range of applications, including virtual worlds like Second Life, conversational agents, remote collaboration systems, training environments, games, and other interactive, character-based media. These models will be developed by combining two main trends in computer animation research: one that builds models based on explicit representations of existing knowledge, and one that mines movement data to create models. The research will integrate computer scientists, digital artists, and movement professionals, bringing a broad set of insights to technology development and providing cross-fertilization between these normally disparate groups. Research results will be published broadly and will lead to new computational tools that can be used in a range of applications.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 0845529
Program Officer: Ephraim P. Glinert
Project Start:
Project End:
Budget Start: 2009-09-01
Budget End: 2014-08-31
Support Year:
Fiscal Year: 2008
Total Cost: $581,276
Indirect Cost:
Name: University of California Davis
Department:
Type:
DUNS #:
City: Davis
State: CA
Country: United States
Zip Code: 95618