Facial expression communicates information about emotional response and plays a critical role in the regulation of interpersonal behavior. Current human-observer-based methods for measuring facial expression are labor intensive, qualitative, and difficult to standardize across laboratories and over time. To make rigorous, quantitative measurement of facial expression feasible in diverse applications, we formed an interdisciplinary research group with expertise in facial expression analysis and image processing. In the funding period, we developed and demonstrated the first version of an automated system for measuring facial expression in digitized images. The system can discriminate nine combinations of FACS action units in the upper and lower face, quantify the timing and topography of action unit intensity in the brow region, and geometrically normalize image sequences within a range of plus or minus 20 degrees of out-of-plane rotation. In the competing renewal, we will increase the number of action unit combinations that are recognized, implement convergent methods of quantifying action unit intensity, extend action unit estimation to a wider range of image orientations, test facial image processing (FIP) on image sequences from directed facial action tasks and laboratory studies of emotion regulation, and facilitate the integration of FIP into existing data management and statistical analysis software for use by behavioral science researchers and clinicians. With these goals completed, FIP will eliminate the need for human observers in coding facial expression, promote standardized measurement, make possible the collection and processing of larger, more representative data sets, and open new areas of investigation and clinical application.
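To make the geometric normalization step concrete, here is a minimal sketch, assuming OpenCV and pre-detected eye landmarks (both are illustrative assumptions, not the project's actual pipeline). It warps a frame by the similarity transform that maps the two eye centers onto canonical positions; handling the out-of-plane rotation described above would additionally require 3D head-pose estimation.

    # Minimal sketch: in-plane geometric normalization from two eye landmarks.
    # The canonical eye positions and output size below are hypothetical.
    import cv2
    import numpy as np

    def normalize_face(frame, left_eye, right_eye, out_size=(128, 128)):
        """Warp frame so the detected eye centers land on fixed positions."""
        w, h = out_size
        # Hypothetical canonical eye positions: 30%/70% across, 40% down.
        dst = np.float32([[0.3 * w, 0.4 * h], [0.7 * w, 0.4 * h]])
        src = np.float32([left_eye, right_eye])
        # Two point pairs determine a 4-DOF similarity transform
        # (rotation, uniform scale, translation).
        m, _ = cv2.estimateAffinePartial2D(src, dst)
        return cv2.warpAffine(frame, m, out_size)

Applied per frame before feature extraction, such a transform makes action unit intensity measurements comparable across frames and subjects.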

Agency: National Institutes of Health (NIH)
Institute: National Institute of Mental Health (NIMH)
Type: Research Project (R01)
Project #: 5R01MH051435-04
Application #: 2675154
Study Section: Social and Group Processes Review Committee (SGP)
Project Start: 1995-08-01
Project End: 2001-04-30
Budget Start: 1998-05-05
Budget End: 1999-04-30
Support Year: 4
Fiscal Year: 1998
Name: Mellon Pitts Corporation (Mpc Corp)
City: Pittsburgh
State: PA
Country: United States
Zip Code: 15213
Girard, Jeffrey M; Cohn, Jeffrey F (2015) Automated Audiovisual Depression Analysis. Curr Opin Psychol 4:75-79
Dibeklioğlu, Hamdi; Hammal, Zakia; Yang, Ying et al. (2015) Multimodal Detection of Depression in Clinical Interviews. Proc ACM Int Conf Multimodal Interact 2015:307-310
Mattson, Whitney I; Cohn, Jeffrey F; Mahoor, Mohammad H et al. (2013) Darwin's Duchenne: eye constriction during infant joy and distress. PLoS One 8:e80161
Girard, Jeffrey M; Cohn, Jeffrey F; Mahoor, Mohammad H et al. (2013) Social Risk and Depression: Evidence from Manual and Automatic Facial Expression Analysis. Proc Int Conf Autom Face Gesture Recognit:1-8
Yang, Ying; Fairbairn, Catherine; Cohn, Jeffrey F (2013) Detecting Depression Severity from Vocal Prosody. IEEE Trans Affect Comput 4:142-150
Schmidt, K; Levenstein, R; Ambadar, Z (2012) Intensity of smiling and attractiveness as facial signals of trustworthiness in women. Percept Mot Skills 114:964-78
Messinger, Daniel S; Mattson, Whitney I; Mahoor, Mohammad H et al. (2012) The eyes have it: making positive expressions more positive and negative expressions more negative. Emotion 12:430-6
Zhu, Yunfeng; De la Torre, Fernando; Cohn, Jeffrey F et al. (2011) Dynamic Cascades with Bidirectional Bootstrapping for Action Unit Detection in Spontaneous Facial Behavior. IEEE Trans Affect Comput 2:79-91
Saragih, Jason M; Lucey, Simon; Cohn, Jeffrey F (2011) Real-time Avatar Animation from a Single Image. Proc Int Conf Autom Face Gesture Recognit:117-124
Lucey, Patrick; Cohn, Jeffrey F; Matthews, Iain et al. (2011) Automatically detecting pain in video through facial action units. IEEE Trans Syst Man Cybern B Cybern 41:664-74

Showing the most recent 10 out of 39 publications