Facial expression communicates information about emotions, regulates interpersonal behavior and person perception, indexes brain functioning, and is essential to evaluating preverbal infants. Current human-observer methods of facial expression analysis, however, are labor intensive and difficult to standardize across laboratories and over time. These constraints force investigators to use less specific systems whose convergent validity is often unknown.

To make rigorous, quantitative measurement of facial expression feasible in diverse applications, our interdisciplinary research group, with expertise in facial expression analysis and computerized image processing, will develop automated methods of facial expression analysis. Facial expressions of emotion will be video-recorded in a directed facial action task and in emotion vignettes. For each subject, a 3-D electronic wireframe face model will be created by fitting a digitized image of the subject's neutral face to a basic model. Computerized feature-detection and tracking procedures will use this face model to analyze input video image sequences. Neural-network pattern recognition algorithms will then measure and classify facial action units from the feature-tracking measurements. A user interface will let users define facial configurations (per EMFACS, MAX, or their own specifications) and generate time-series or summary data files for immediate analysis in SPSS or other statistical software.

The automated method will eliminate the need for human observers in coding facial expression (greatly reducing coding time and personnel costs), promote standardized measurement, make possible the collection and processing of larger, more representative data sets, and open new areas of investigation and clinical application. For example, comparisons between automated and human-observer methods can inform how people process emotion expressions. Unlike methods requiring skilled observation, the automated method will be readily transferable from the laboratory to the clinic for use in diagnostics and in the analysis of patient communications.
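As a rough illustration of the pipeline's final stage, the sketch below maps one frame's feature-tracking measurements (landmark displacements relative to the neutral-face model) to action-unit probabilities with a small feedforward network. This is a minimal sketch under assumed inputs: the function name `classify_action_units`, the feature and AU counts, and the randomly initialized weights (standing in for a trained network) are all hypothetical, not the project's actual code.

```python
import numpy as np

# Hypothetical sketch of AU classification from tracked facial features.
# All names, dimensions, and weights are illustrative, not the project's code.

RNG = np.random.default_rng(0)

N_FEATURES = 12   # e.g., displacements of tracked landmarks vs. the neutral face
N_HIDDEN = 8
N_AUS = 4         # e.g., AU1, AU6, AU12, AU15

# Randomly initialized weights stand in for a trained network.
W1 = RNG.normal(0.0, 0.5, (N_FEATURES, N_HIDDEN))
W2 = RNG.normal(0.0, 0.5, (N_HIDDEN, N_AUS))

def classify_action_units(displacements: np.ndarray) -> np.ndarray:
    """Map one frame's landmark displacements to per-AU probabilities."""
    hidden = np.tanh(displacements @ W1)       # nonlinear hidden layer
    logits = hidden @ W2
    return 1.0 / (1.0 + np.exp(-logits))       # independent sigmoid per AU

# One simulated frame: small displacements from the neutral-face model.
frame = RNG.normal(0.0, 1.0, N_FEATURES)
print(np.round(classify_action_units(frame), 3))
```

Per-AU sigmoid outputs (rather than a single softmax) reflect that action units can co-occur within a frame; thresholding each output would yield the time-series AU codes the abstract describes exporting for statistical analysis.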

Agency
National Institutes of Health (NIH)
Institute
National Institute of Mental Health (NIMH)
Type
Research Project (R01)
Project #
1R01MH051435-01A3
Application #
2250696
Study Section
Special Emphasis Panel (SRCM (03))
Project Start
1995-08-01
Project End
1997-07-31
Budget Start
1995-08-01
Budget End
1996-07-31
Support Year
1
Fiscal Year
1995
Total Cost
Indirect Cost
Name
Mellon Pitts Corporation (MPC Corp)
Department
Type
DUNS #
City
Pittsburgh
State
PA
Country
United States
Zip Code
15213
Girard, Jeffrey M; Cohn, Jeffrey F (2015) Automated Audiovisual Depression Analysis. Curr Opin Psychol 4:75-79
Dibeklioğlu, Hamdi; Hammal, Zakia; Yang, Ying et al. (2015) Multimodal Detection of Depression in Clinical Interviews. Proc ACM Int Conf Multimodal Interact 2015:307-310
Mattson, Whitney I; Cohn, Jeffrey F; Mahoor, Mohammad H et al. (2013) Darwin's Duchenne: eye constriction during infant joy and distress. PLoS One 8:e80161
Girard, Jeffrey M; Cohn, Jeffrey F; Mahoor, Mohammad H et al. (2013) Social Risk and Depression: Evidence from Manual and Automatic Facial Expression Analysis. Proc Int Conf Autom Face Gesture Recognit:1-8
Yang, Ying; Fairbairn, Catherine; Cohn, Jeffrey F (2013) Detecting Depression Severity from Vocal Prosody. IEEE Trans Affect Comput 4:142-150
Schmidt, K; Levenstein, R; Ambadar, Z (2012) Intensity of smiling and attractiveness as facial signals of trustworthiness in women. Percept Mot Skills 114:964-78
Messinger, Daniel S; Mattson, Whitney I; Mahoor, Mohammad H et al. (2012) The eyes have it: making positive expressions more positive and negative expressions more negative. Emotion 12:430-6
Zhu, Yunfeng; De la Torre, Fernando; Cohn, Jeffrey F et al. (2011) Dynamic Cascades with Bidirectional Bootstrapping for Action Unit Detection in Spontaneous Facial Behavior. IEEE Trans Affect Comput 2:79-91
Saragih, Jason M; Lucey, Simon; Cohn, Jeffrey F (2011) Real-time Avatar Animation from a Single Image. Proc Int Conf Autom Face Gesture Recognit:117-124
Lucey, Patrick; Cohn, Jeffrey F; Matthews, Iain et al. (2011) Automatically detecting pain in video through facial action units. IEEE Trans Syst Man Cybern B Cybern 41:664-74

Showing the most recent 10 out of 39 publications