The goal of this research is to develop a system that automatically generates and animates conversations between multiple cooperative agents, with appropriate and synchronized speech, intonation, facial expressions, and hand gestures. The research is based on theory that addresses the relations and coordination among these channels. The significance of this research lies in providing a three-dimensional computer animation testbed for theories of cooperative conversation; human-machine interaction and training systems need more interactive and cooperative synthetic agents. Conversations are created by a dialogue planner that produces the text as well as the intonation of the utterances. The speaker/listener relationship, the content of the text, the intonation, and the actions undertaken together drive the generators for facial expressions, lip motions, eye gaze, head motion, and arm gestures. The project will focus on domains in which agents must propose and agree on abstract plans, and may have to motivate and carry out physical actions and refer to objects in their physical environment during the conversation.
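As a rough illustration of the pipeline the abstract describes, the Python sketch below shows how a planned utterance annotated with intonation might be fanned out into synchronized per-channel animation events. It is only a minimal model of the stated architecture, not the project's implementation; the names Utterance, ChannelEvent, and generate_channels are hypothetical.

```python
# Minimal sketch (hypothetical, not the project's system): a dialogue planner
# is assumed to emit utterances with intonation annotations, and a channel
# generator maps each utterance plus its speaker/listener context to
# word-synchronized events for face, lips, gaze, head, and gesture.
from dataclasses import dataclass
from typing import List

@dataclass
class Utterance:
    speaker: str
    listener: str
    words: List[str]
    pitch_accents: List[int]   # indices of intonationally accented words
    action: str = ""           # physical action motivated by the dialogue, if any

@dataclass
class ChannelEvent:
    channel: str               # "face", "lips", "gaze", "head", or "gesture"
    word_index: int            # word the event is synchronized with
    description: str

def generate_channels(utt: Utterance) -> List[ChannelEvent]:
    """Derive synchronized nonverbal behavior from one planned utterance."""
    events: List[ChannelEvent] = []
    for i, word in enumerate(utt.words):
        # Lip motion accompanies every spoken word.
        events.append(ChannelEvent("lips", i, f"visemes for '{word}'"))
        # Accented words attract beat gestures and eyebrow raises.
        if i in utt.pitch_accents:
            events.append(ChannelEvent("gesture", i, "beat gesture on accented word"))
            events.append(ChannelEvent("face", i, "eyebrow raise on accented word"))
    # Speaker gazes at the listener when yielding the turn.
    events.append(ChannelEvent("gaze", len(utt.words) - 1,
                               f"look toward {utt.listener} when yielding the turn"))
    if utt.action:
        # Physical actions referred to in the utterance motivate iconic or
        # deictic gestures toward objects in the environment.
        events.append(ChannelEvent("gesture", 0,
                                   f"iconic/deictic gesture for action '{utt.action}'"))
    return events

if __name__ == "__main__":
    utt = Utterance(speaker="agent_a", listener="agent_b",
                    words=["we", "should", "move", "the", "red", "block"],
                    pitch_accents=[2, 4], action="move the red block")
    for ev in generate_channels(utt):
        print(ev)
```

In a fuller system, each event list would feed a separate animation generator; the sketch only illustrates how one utterance's text, intonation, and associated action could index the multiple output channels on a common word timeline.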

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Application #: 9504372
Program Officer: Gary W Strong
Budget Start: 1995-06-01
Budget End: 1998-12-31
Fiscal Year: 1995
Total Cost: $560,000
Name: University of Pennsylvania
City: Philadelphia
State: PA
Country: United States
Zip Code: 19104