Emotion is the complex psycho-physiological experience of an individual's state of mind. It affects every aspect of rational thinking, learning, decision making, and psychomotor ability. Emotion modeling and recognition play an increasingly important role in many research areas, including human-computer interaction, robotics, artificial intelligence, and advanced technologies for education and learning. Current emotion-related research, however, is impeded by the lack of a large corpus of spontaneous emotion data. With few exceptions, existing emotion databases are limited in size, sensor modalities, labeling, and elicitation methods. Most rely on posed emotions, which may bear little resemblance to what occurs in the contexts where emotions are actually triggered. In this project the PIs will address these limitations by developing a multimodal and multidimensional corpus of dynamic spontaneous emotion and facial expression data, with labels and feature derivatives, from approximately 200 subjects of different ethnicities and ages, using sensors of different modalities. To these ends, they will acquire a 6-camera wide-range 3D dynamic imaging system to capture ultra-high-resolution facial geometric data and video texture data, which will allow them to examine fine structural changes as well as the precise time course of spontaneous expressions. The video data will be accompanied by other sensor modalities, including thermal, audio, and physiological sensors. An IR thermal camera will allow real-time recording of facial temperature, while an audio sensor will record the voices of both subject and experimenter. The physiological sensor will measure skin conductivity and related signals. Tools and methods to facilitate and simplify use of the dataset will be provided. The entire dataset, including metadata and associated software, will be stored in a public repository and made available for research in computer vision, affective computing, human-computer interaction, and related fields.
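
To make the multimodal composition of such a corpus concrete, the minimal sketch below models one capture session as a plain data record. All names here (CaptureSession, the field names, the task label) are illustrative assumptions for exposition, not the project's actual schema or file layout.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CaptureSession:
    """Hypothetical record for one subject's capture session.

    Field names are assumptions for illustration; the real corpus
    schema is defined by the project, not by this sketch.
    """
    subject_id: str                  # anonymized subject identifier
    task: str                        # elicitation task, e.g. "cold_pressor"
    mesh_frames: List[str] = field(default_factory=list)     # 3D geometry, per frame
    texture_frames: List[str] = field(default_factory=list)  # 2D video texture, per frame
    thermal_frames: List[str] = field(default_factory=list)  # IR facial temperature maps
    audio_path: str = ""             # subject and experimenter audio track
    physio_path: str = ""            # skin conductance and related signals

# Example: an empty session record for one subject and one elicitation task.
session = CaptureSession(subject_id="S001", task="cold_pressor")
```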

Intellectual Merit

This research will involve construction of a corpus of spontaneous, multidimensional, multimodal emotion and facial expression data that is significantly larger than any currently in existence. To elicit natural and spontaneous emotions from subjects, the PIs will employ five approaches: physical experience, film clips, the cold pressor task, relived-memories tasks, and interview formats. The database will employ sensors of different modalities, including high-resolution 2D/3D video cameras, infrared thermal cameras, audio sensors, and physiological sensors. The video data will be labeled according to a number of categories, including AU labeling and emotion labeling from self-report and the perceptual judgments of naïve observers. Comprehensive emotion labeling will include dimensional approaches (e.g., valence, arousal), discrete emotions and related events (e.g., joy, anger, smile controls), anatomically based methods (e.g., FACS), and paralinguistic signaling (e.g., back-channeling). Additional features will be derived from the raw data, including 2D/3D facial feature points, head pose, and audio parameters.
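
One way to picture this layered labeling is the hedged sketch below, which bundles FACS action units, dimensional ratings, and categorical labels for a single annotated segment. Every name (AnnotatedSegment, the rating scales, the AU intensity range) is an assumption chosen for illustration, not the corpus's published annotation format.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class AnnotatedSegment:
    """Hypothetical per-segment annotation bundle (illustrative only)."""
    start_frame: int
    end_frame: int
    action_units: Dict[int, float] = field(default_factory=dict)  # FACS AU -> intensity (assumed 0-5)
    valence: float = 0.0             # dimensional rating, assumed -1 (negative) .. +1 (positive)
    arousal: float = 0.0             # dimensional rating, assumed 0 (calm) .. 1 (excited)
    self_report: str = ""            # subject's own emotion label
    observer_labels: List[str] = field(default_factory=list)      # naive-observer judgments

# Example: a segment showing AU6 + AU12 (a Duchenne smile) labeled as joy.
seg = AnnotatedSegment(start_frame=120, end_frame=480,
                       action_units={6: 3.0, 12: 4.0},
                       valence=0.8, arousal=0.5,
                       self_report="joy",
                       observer_labels=["joy", "happiness"])
```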

Broader Impact

Project outcomes will immediately benefit researchers in computer vision and in emotion modeling and recognition, because the database will allow them to train and validate their facial expression and emotion recognition algorithms. The new corpus will facilitate the study of multimodal fusion of audio, video, geometric, thermal, and physiological responses. It will contribute to a comprehensive understanding of the mechanisms underlying human behavior, and will enable enhancements to human-computer interaction (e.g., through emotion-sensitive and socially intelligent interfaces), robotics, artificial intelligence, and cognitive science. The work is also likely to significantly impact research in diverse other fields, such as psychology, biometrics, medicine and the life sciences, law enforcement, education, entertainment, and social science.
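
The multimodal-fusion use case can be illustrated with a minimal feature-level (early) fusion sketch, assuming per-frame feature vectors are already extracted from each modality. The vector sizes and the normalize-then-concatenate strategy are assumptions; real systems might instead fuse at the decision level or learn the fusion jointly.

```python
import numpy as np

# Hypothetical per-frame feature vectors from each modality (sizes are assumptions).
rng = np.random.default_rng(0)
video_feats   = rng.standard_normal(128)  # e.g., geometric/texture descriptors
audio_feats   = rng.standard_normal(40)   # e.g., prosodic/spectral parameters
thermal_feats = rng.standard_normal(16)   # e.g., regional facial temperatures
physio_feats  = rng.standard_normal(4)    # e.g., skin-conductance statistics

def zscore(x: np.ndarray) -> np.ndarray:
    """Standardize one modality's features so all modalities share a common scale."""
    return (x - x.mean()) / (x.std() + 1e-8)

# Early fusion: normalize each modality, then concatenate into a single vector
# that a downstream emotion classifier could consume.
fused = np.concatenate([zscore(v) for v in
                        (video_feats, audio_feats, thermal_feats, physio_feats)])
print(fused.shape)  # (188,)
```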

Agency: National Science Foundation (NSF)
Institute: Division of Computer and Network Systems (CNS)
Type: Standard Grant (Standard)
Application #: 1205664
Program Officer: Ephraim Glinert
Budget Start: 2012-09-01
Budget End: 2016-08-31
Fiscal Year: 2012
Total Cost: $306,800
Name: SUNY at Binghamton
City: Binghamton
State: NY
Country: United States
Zip Code: 13902