Facial expression has been a focus of emotion research for over a hundred years. In recent decades, observations of facial expressions have yielded critical and dramatic insights into the etiology of psychopathology and have proven capable of predicting treatment outcomes (see Ekman & Rosenberg, 2005). Despite these striking initial findings, there has been surprisingly little follow-up work. The primary reason for the lack of sustained research is that the most reliable manual systems for measuring facial expression require considerable training and are labor intensive. Automated measurement using computer vision and machine learning seeks to address the need for valid, efficient, and reproducible measurement. Recent systems have shown promise in fairly small studies using posed behavior or structured contexts with confederates or trained interviewers, or using pre-trained (person-specific) face models. For automated coding to be applied in real-world settings, a large database with ample variability in pose, head motion, skin color, gender, partial occlusion, and expression intensity is needed. We have developed a unique database that meets this need, along with the algorithms necessary to enable robust automated coding. The database consists of 720 participants in three-person groups engaged in a group formation task. In a preliminary study, we demonstrated that our algorithms can successfully code two key facial signals associated with human emotion in this relatively unconstrained context (Cohn & Sayette, 2010). To achieve efficient, accurate, and valid measurement of facial expression usable in research and clinical settings, we aim to 1) train and validate classifiers to achieve reliable facial expression detection across this unprecedentedly large, diverse data set; 2) extend previous person-specific methods to person-independent (generic) facial feature detection, tracking, and alignment; and 3) make these tools available for research and clinical use.
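As a minimal sketch of the kind of pipeline the abstract describes, the snippet below trains one binary classifier per facial signal on tracked-face features and evaluates it person-independently by splitting cross-validation folds by subject, so no participant appears in both training and test sets. Everything here is an illustrative assumption, not the project's actual code: the features are synthetic stand-ins for landmark geometry and local appearance descriptors, and scikit-learn's linear SVM is one plausible classifier choice among many.

```python
"""Hypothetical sketch: person-independent facial action unit detection."""
import numpy as np
from sklearn.model_selection import GroupKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Synthetic stand-in for tracked-face features: one row per video frame,
# e.g., aligned landmark coordinates concatenated with appearance
# descriptors around each landmark. Real data would replace this.
n_frames, n_features, n_subjects = 3000, 200, 30
X = rng.normal(size=(n_frames, n_features))
y = rng.integers(0, 2, size=n_frames)           # signal present (1) / absent (0)
subjects = rng.integers(0, n_subjects, size=n_frames)

# One linear SVM per facial signal, with features standardized
# using statistics from the training folds only.
clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0, dual=False))

# Person-independent evaluation: GroupKFold splits by subject, so the
# classifier is always tested on people it has never seen.
cv = GroupKFold(n_splits=5)
scores = []
for train_idx, test_idx in cv.split(X, y, groups=subjects):
    clf.fit(X[train_idx], y[train_idx])
    scores.append(clf.score(X[test_idx], y[test_idx]))

print(f"Subject-independent accuracy: {np.mean(scores):.3f}")
```

Grouping folds by subject rather than by frame is what distinguishes the person-independent (generic) setting in aim 2 from person-specific models, which are trained and tested on the same individual.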

Public Health Relevance

The project has two target application domains. For behavioral science, automated facial expression analysis will provide researchers with powerful tools to examine basic questions in emotion and interpersonal processes, as well as the emotion processes underlying diverse forms of psychopathology and neurological disorders. For clinical use, automated facial expression analysis will help clinicians assess vulnerability and protective factors and objectively evaluate the course of treatment across a wide range of disorders, including major depression, bipolar disorder, schizophrenia, anxiety, addiction, suicide risk, and pain.

Agency
National Institutes of Health (NIH)
Institute
National Institute of Mental Health (NIMH)
Type
Research Project (R01)
Project #
5R01MH096951-02
Application #
8464280
Study Section
Social Psychology, Personality and Interpersonal Processes Study Section (SPIP)
Program Officer
Simmons, Janine M
Project Start
2012-05-01
Project End
2017-02-28
Budget Start
2013-03-01
Budget End
2014-02-28
Support Year
2
Fiscal Year
2013
Total Cost
$497,842
Indirect Cost
$95,421
Name
University of Pittsburgh
Department
Psychology
Type
Schools of Arts and Sciences
DUNS #
004514360
City
Pittsburgh
State
PA
Country
United States
Zip Code
15213
Cohn, Jeffrey F; Okun, Michael S; Jeni, Laszlo A et al. (2018) Automated Affect Detection in Deep Brain Stimulation for Obsessive-Compulsive Disorder: A Pilot Study. Proc ACM Int Conf Multimodal Interact 2018:40-44
Dibeklioglu, Hamdi; Hammal, Zakia; Cohn, Jeffrey F (2018) Dynamic Multimodal Measurement of Depression Severity Using Deep Autoencoding. IEEE J Biomed Health Inform 22:525-536
Hammal, Zakia; Cohn, Jeffrey F; Wallace, Erin R et al. (2018) Facial Expressiveness in Infants With and Without Craniofacial Microsomia: Preliminary Findings. Cleft Palate Craniofac J 55:711-720
Kacem, Anis; Hammal, Zakia; Daoudi, Mohamed et al. (2018) Detecting Depression Severity by Interpretable Representations of Motion Dynamics. Proc Int Conf Autom Face Gesture Recognit 2018:739-745
Girard, Jeffrey M; Wright, Aidan G C (2018) DARMA: Software for dual axis rating and media annotation. Behav Res Methods 50:902-909
Girard, Jeffrey M; Chu, Wen-Sheng; Jeni, László A et al. (2017) Sayette Group Formation Task (GFT) Spontaneous Facial Expression Database. Proc Int Conf Autom Face Gesture Recognit 2017:581-588
Valstar, Michel F; Sánchez-Lozano, Enrique; Cohn, Jeffrey F et al. (2017) FERA 2017 - Addressing Head Pose in the Third Facial Expression Recognition and Analysis Challenge. Proc Int Conf Autom Face Gesture Recognit 2017:839-847
Chu, Wen-Sheng; De la Torre, Fernando; Cohn, Jeffrey F (2017) Selective Transfer Machine for Personalized Facial Expression Analysis. IEEE Trans Pattern Anal Mach Intell 39:529-545
Hammal, Zakia; Chu, Wen-Sheng; Cohn, Jeffrey F et al. (2017) Automatic Action Unit Detection in Infants Using Convolutional Neural Network. Int Conf Affect Comput Intell Interact Workshops 2017:216-221
Chu, Wen-Sheng; De la Torre, Fernando; Cohn, Jeffrey F et al. (2017) A Branch-and-Bound Framework for Unsupervised Common Event Discovery. Int J Comput Vis 123:372-391

Showing the most recent 10 out of 31 publications