Facial expression provides cues about emotional response, regulates interpersonal behavior, and communicates aspects of psychopathology. Human-observer-based methods for measuring facial expression are labor intensive, qualitative, and difficult to standardize. Our interdisciplinary team of computer and behavioral scientists has developed the CMU/Pitt Automated Facial Image Analysis (AFA) system, which automatically recognizes facial action units and analyzes their timing in facial behavior. The quantitative measurement achieved by AFA represents a major advance over manual and subjective measurement and requires no invasive sensors. We envision using AFA's reliable, valid, and efficient measurement of emotion expression and related nonverbal behavior to assess symptom severity in depression. Current methods of clinical assessment of depression depend almost entirely on verbal report (clinical interview and/or questionnaire). They lack systematic and efficient ways of incorporating behavioral observations that are strong indicators of depressive symptoms, especially those related to the timing of dyadic interaction between clinician and patient, much of which may occur outside the awareness of either individual. AFA can extract both the type and timing of nonverbal indicators of depression. Our hypothesis is that quantitative measures of the configuration and timing of facial expression, head motion, and gaze obtained by AFA will improve clinical assessment of symptom severity and evaluation of treatment outcomes when combined with information from interviews and self-report questionnaires. We propose to test this hypothesis in 40 participants enrolled in a treatment intervention study for major depression. Interview, questionnaire, and video data will be collected at regular intervals over the course of treatment. To measure social dynamics, both patient and interviewer will be video recorded and the recordings analyzed with AFA. Longitudinal multilevel modeling will be used to test the study hypotheses. We will further improve AFA's algorithms and capabilities to meet the evaluation goals and to prepare AFA for use by the scientific and clinical community.
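As a purely illustrative aid, the sketch below shows one way a longitudinal multilevel (mixed-effects) analysis of this kind could be set up, with repeated assessments nested within participants. All names here (the file afa_longitudinal.csv and columns such as participant_id, week, hamd_score, smile_duration, and head_motion) are hypothetical placeholders, not variables or measures taken from the proposed study.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per participant per assessment visit,
# combining AFA-derived facial measures with interview-based symptom scores.
df = pd.read_csv("afa_longitudinal.csv")

# Multilevel (mixed-effects) model: fixed effects for time and facial measures,
# random intercept and random slope for time within each participant.
model = smf.mixedlm(
    "hamd_score ~ week + smile_duration + head_motion",
    data=df,
    groups=df["participant_id"],
    re_formula="~week",
)
result = model.fit()
print(result.summary())

Under these assumptions, the fixed-effect coefficients estimate how symptom severity changes over treatment and how it covaries with the facial measures, while the random effects capture between-participant differences in baseline severity and rate of change.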

Agency: National Institutes of Health (NIH)
Institute: National Institute of Mental Health (NIMH)
Type: Research Project (R01)
Project #: 5R01MH051435-16
Application #: 7817102
Study Section: Social Psychology, Personality and Interpersonal Processes Study Section (SPIP)
Program Officer: Freund, Michelle
Project Start: 1995-08-01
Project End: 2012-04-30
Budget Start: 2010-05-01
Budget End: 2012-04-30
Support Year: 16
Fiscal Year: 2010
Total Cost: $434,355
Indirect Cost:
Name: University of Pittsburgh
Department: Psychology
Type: Schools of Arts and Sciences
DUNS #: 004514360
City: Pittsburgh
State: PA
Country: United States
Zip Code: 15213
Girard, Jeffrey M; Cohn, Jeffrey F (2015) Automated Audiovisual Depression Analysis. Curr Opin Psychol 4:75-79
Dibeklioğlu, Hamdi; Hammal, Zakia; Yang, Ying et al. (2015) Multimodal Detection of Depression in Clinical Interviews. Proc ACM Int Conf Multimodal Interact 2015:307-310
Mattson, Whitney I; Cohn, Jeffrey F; Mahoor, Mohammad H et al. (2013) Darwin's Duchenne: eye constriction during infant joy and distress. PLoS One 8:e80161
Girard, Jeffrey M; Cohn, Jeffrey F; Mahoor, Mohammad H et al. (2013) Social Risk and Depression: Evidence from Manual and Automatic Facial Expression Analysis. Proc Int Conf Autom Face Gesture Recognit :1-8
Yang, Ying; Fairbairn, Catherine; Cohn, Jeffrey F (2013) Detecting Depression Severity from Vocal Prosody. IEEE Trans Affect Comput 4:142-150
Schmidt, K; Levenstein, R; Ambadar, Z (2012) Intensity of smiling and attractiveness as facial signals of trustworthiness in women. Percept Mot Skills 114:964-78
Messinger, Daniel S; Mattson, Whitney I; Mahoor, Mohammad H et al. (2012) The eyes have it: making positive expressions more positive and negative expressions more negative. Emotion 12:430-6
Zhu, Yunfeng; De la Torre, Fernando; Cohn, Jeffrey F et al. (2011) Dynamic Cascades with Bidirectional Bootstrapping for Action Unit Detection in Spontaneous Facial Behavior. IEEE Trans Affect Comput 2:79-91
Saragih, Jason M; Lucey, Simon; Cohn, Jeffrey F (2011) Real-time Avatar Animation from a Single Image. Proc Int Conf Autom Face Gesture Recognit :117-124
Lucey, Patrick; Cohn, Jeffrey F; Matthews, Iain et al. (2011) Automatically detecting pain in video through facial action units. IEEE Trans Syst Man Cybern B Cybern 41:664-74

Showing the most recent 10 out of 39 publications