Modeling believable human characters in virtual environments is crucial to unleashing the potential of computers as an interactive medium. A realistic depiction of human motion dramatically enhances perceptual immersion for task training, social communication, and tele-learning. In a believable virtual environment, a human character must be able to plan its locomotion, react to its surroundings, and engage other virtual characters or real humans with a realistic display of emotion and personality. This daunting task requires simulation spanning the musculoskeletal and nervous systems, as these intricate components must coordinate in synchrony whenever a functional activity is undertaken. Physics-based character animation today can simulate complex locomotion sequences with stunning visual realism, albeit under the careful guidance of an animator. What is missing from most character animation techniques is autonomous control, both conscious and unconscious. In this project, the investigator expands computer-generated character animation from a visualization tool into an interdisciplinary research area focused on autonomous control and realistic locomotion.

This project focuses on two interrelated research thrusts fundamentally aligned with the interdisciplinary nature of the investigator's research theme: (1) synthesis of non-ballistic, low-energy human locomotion, and (2) synthesis of interaction among multiple characters. Unlike ballistic, highly dynamic motion, non-ballistic motion is not stringently determined by physics; it therefore requires more sophisticated models to address the complex interplay between unconscious autonomous control, such as physiology and emotion, and conscious autonomous control, such as the character's intent. Autonomous control becomes even more crucial when interaction among multiple characters comes into play. Synthesizing interaction is challenging because high-level artificial intelligence algorithms and low-level locomotion intertwine in unpredictable ways once the interaction takes place. Beyond the scope of computer animation, this computational model can serve as a tool for recognizing emotion, diagnosing musculoskeletal anomalies, or validating biomechanical hypotheses.

Agency: National Science Foundation (NSF)
Institute: Division of Computing and Communication Foundations (CCF)
Type: Standard Grant
Application #: 0742302
Program Officer: Lawrence Rosenblum
Budget Start: 2007-07-01
Budget End: 2013-06-30
Fiscal Year: 2007
Total Cost: $400,000
Name: Georgia Tech Research Corporation
City: Atlanta
State: GA
Country: United States
Zip Code: 30332