Facial expression provides sensitive cues about emotional response and plays a critical role in the regulation of interpersonal behavior. Human-observer-based methods for measuring facial expression are labor intensive, qualitative, and difficult to standardize across laboratories and over time. To make more rigorous, quantitative measurement of facial expression feasible in diverse applications, the investigators formed an interdisciplinary research group that combines expertise in facial expression analysis and computer vision. In August 1995, the group received initial funding for Facial Expression Analysis by Computer Image Processing (NIMH #1R01MH51435). With NIMH support, they created a large, representative database for method development and developed and validated a Face Analysis System that tracks gaze and facial features in digitized image sequences of infant, child, and adult subjects. In near-frontal views, the Face Analysis System has achieved concurrent validity with manual FACS coding for 16 of 44 FACS action units and more than 30 combinations in which they occur. For the competing renewal, the investigators will (1) increase system robustness to head orientation and head motion, (2) increase the number of recognizable action units to include all 30 that have a specific anatomic basis, and (3) conduct a systematic, large-scale test of the Face Analysis System using multiple databases. These databases encompass subjects of various ages and backgrounds and vary in type of emotion induction, head orientation, head motion, size of the face in pixels, and presence of speech. The databases were originally collected to answer substantive questions about emotion processes; they represent the types of data the Face Analysis System will encounter in psychology research and clinical applications.
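The concurrent-validity comparison described above (automated detection versus manual FACS coding, evaluated per action unit) can be illustrated with a short sketch. The following minimal Python example is an assumption-laden illustration, not the investigators' actual procedure: the frame-level binary AU labels are hypothetical, and Cohen's kappa is used here only as one common agreement statistic (the abstract does not specify which statistic was used).

```python
# Illustrative sketch: per-AU agreement between manual FACS codes and
# automated detections. All labels and AU numbers below are hypothetical.

def cohens_kappa(manual, auto):
    """Cohen's kappa for two equal-length binary label sequences."""
    n = len(manual)
    po = sum(m == a for m, a in zip(manual, auto)) / n  # observed agreement
    p_manual = sum(manual) / n   # rate at which the manual coder says "present"
    p_auto = sum(auto) / n       # rate at which the system says "present"
    # Chance agreement: both say "present", or both say "absent".
    pe = p_manual * p_auto + (1 - p_manual) * (1 - p_auto)
    return (po - pe) / (1 - pe) if pe < 1 else 1.0

# Hypothetical frame-level codes for two action units (1 = AU present).
manual_codes = {"AU12": [1, 1, 0, 0, 1, 0], "AU6": [1, 0, 0, 0, 1, 0]}
auto_codes   = {"AU12": [1, 1, 0, 1, 1, 0], "AU6": [1, 0, 0, 0, 0, 0]}

for au in manual_codes:
    k = cohens_kappa(manual_codes[au], auto_codes[au])
    print(f"{au}: kappa = {k:.2f}")
```

In practice, agreement would be computed over many coded sequences per action unit; chance-corrected statistics such as kappa are preferred over raw percent agreement because many AUs occur in only a small fraction of frames.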

Agency: National Institutes of Health (NIH)
Institute: National Institute of Mental Health (NIMH)
Type: Research Project (R01)
Project #: 2R01MH051435-07
Application #: 6287970
Study Section: Special Emphasis Panel (ZRG1-RPHB-4 (01))
Program Officer: Huerta, Michael F
Project Start: 1995-08-01
Project End: 2006-04-30
Budget Start: 2001-05-01
Budget End: 2002-04-30
Support Year: 7
Fiscal Year: 2001
Total Cost: $334,747
Indirect Cost:
Name: University of Pittsburgh
Department: Psychology
Type: Schools of Arts and Sciences
DUNS #: 053785812
City: Pittsburgh
State: PA
Country: United States
Zip Code: 15213
Girard, Jeffrey M; Cohn, Jeffrey F (2015) Automated Audiovisual Depression Analysis. Curr Opin Psychol 4:75-79
Dibeklioğlu, Hamdi; Hammal, Zakia; Yang, Ying et al. (2015) Multimodal Detection of Depression in Clinical Interviews. Proc ACM Int Conf Multimodal Interact 2015:307-310
Mattson, Whitney I; Cohn, Jeffrey F; Mahoor, Mohammad H et al. (2013) Darwin's Duchenne: eye constriction during infant joy and distress. PLoS One 8:e80161
Girard, Jeffrey M; Cohn, Jeffrey F; Mahoor, Mohammad H et al. (2013) Social Risk and Depression: Evidence from Manual and Automatic Facial Expression Analysis. Proc Int Conf Autom Face Gesture Recognit:1-8
Yang, Ying; Fairbairn, Catherine; Cohn, Jeffrey F (2013) Detecting Depression Severity from Vocal Prosody. IEEE Trans Affect Comput 4:142-150
Schmidt, K; Levenstein, R; Ambadar, Z (2012) Intensity of smiling and attractiveness as facial signals of trustworthiness in women. Percept Mot Skills 114:964-78
Messinger, Daniel S; Mattson, Whitney I; Mahoor, Mohammad H et al. (2012) The eyes have it: making positive expressions more positive and negative expressions more negative. Emotion 12:430-6
Zhu, Yunfeng; De la Torre, Fernando; Cohn, Jeffrey F et al. (2011) Dynamic Cascades with Bidirectional Bootstrapping for Action Unit Detection in Spontaneous Facial Behavior. IEEE Trans Affect Comput 2:79-91
Saragih, Jason M; Lucey, Simon; Cohn, Jeffrey F (2011) Real-time Avatar Animation from a Single Image. Proc Int Conf Autom Face Gesture Recognit:117-124
Lucey, Patrick; Cohn, Jeffrey F; Matthews, Iain et al. (2011) Automatically detecting pain in video through facial action units. IEEE Trans Syst Man Cybern B Cybern 41:664-74

Showing the most recent 10 out of 39 publications