A reliable and valid automated system for quantifying human affective behavior in ecologically valid, naturalistic environments would be a transformational tool for research and clinical practice. With NIMH support (MH R01-096951), we have made fundamental progress toward this goal. In the proposed project, we extend current capabilities in automated multimodal measurement of affective behavior (visual, acoustic, and verbal) to develop and validate an automated system for detecting the constructs of Positive, Aggressive, and Dysphoric behavior, along with their component lower-level affective behaviors and verbal content. The system is based on the manual Living in Family Environments (LIFE) Coding System, which has yielded critical findings on developmental psychopathology and on interpersonal processes in depression and other disorders.

Two models will be developed: one using theoretically derived features informed by previous research in behavioral science and affective computing, the other using empirically derived features informed by deep learning. The models will be trained on three separate databases of dyadic and triadic interaction tasks comprising over 1,300 adolescent and adult participants from the US and Australia. Intersystem reliability with manual coding will be evaluated using k-fold cross-validation for both momentary and session-level summary scores. Differences between models, and in relation to participant factors, will be tested using the general linear model. To ensure generalizability, we will further train and test across independent databases. To evaluate the construct validity of automated coding, we will use the ample validity data available in the three databases to determine whether automated coding achieves the same or a better pattern of findings with respect to depression risk and development.
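The reliability evaluation described above pairs k-fold cross-validation with agreement between automated and manual codes. As a rough illustration only (the toy labels, contiguous fold scheme, and choice of Cohen's kappa here are placeholder assumptions, not the project's actual protocol), per-fold chance-corrected agreement could be sketched as:

```python
# Hypothetical sketch of k-fold intersystem reliability between manual
# codes and automated predictions. All data, labels, and function names
# are illustrative, not taken from the project.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two categorical raters."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum((ca[lab] / n) * (cb[lab] / n) for lab in set(ca) | set(cb))
    if expected == 1.0:  # degenerate case: both raters constant
        return 1.0
    return (observed - expected) / (1 - expected)

def k_fold_indices(n, k):
    """Split range(n) into k contiguous folds of near-equal size."""
    sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start, folds = 0, []
    for size in sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def kfold_reliability(manual, automated, k=5):
    """Kappa between manual and automated codes, computed per fold."""
    return [cohens_kappa([manual[i] for i in fold],
                         [automated[i] for i in fold])
            for fold in k_fold_indices(len(manual), k)]

# Toy momentary codes for the three constructs (POS/AGG/DYS).
manual    = ["POS", "AGG", "DYS", "POS", "POS", "AGG", "DYS", "POS", "AGG", "POS"]
automated = ["POS", "AGG", "POS", "POS", "POS", "AGG", "DYS", "DYS", "AGG", "POS"]
print(kfold_reliability(manual, automated, k=2))
```

In practice the folds would be defined over sessions or participants (so no participant appears in both training and test partitions), and session-level summary scores would be compared with a continuous agreement index rather than kappa.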
Following procedures already in place for sharing databases and software tools, we will design the automated systems for use by non-specialists and make them available for research and clinical use. Achieving these goals will provide behavioral scientists with powerful tools to examine basic questions in emotion, psychopathology, and interpersonal processes, and will enable clinicians to improve assessment and to track change in clinical and interpersonal functioning over time.

Relevance

For behavioral science, automated coding of affective behavior from multimodal (visual, acoustic, and verbal) input will provide researchers with powerful tools to examine basic questions in emotion, psychopathology, and interpersonal processes. For clinical use, automated measurement will help clinicians assess vulnerability and protective factors and response to treatment for a wide range of disorders. More generally, automated measurement would contribute to advances in intelligent tutors in education, training in social skills, persuasion in counseling, and affective computing more broadly.

Public Health Relevance

Observational methods of measuring affective behavior have yielded critical insights into emotion, socio-emotional development, and psychopathology. A persistent barrier to their wide application is that they are labor-intensive to learn and to use. Our interdisciplinary team of behavioral and computer scientists will develop and validate a fully automated system for measuring affective behavior from multimodal (face, gaze, body, voice, and speech) input for research and clinical use.

National Institutes of Health (NIH)
National Institute of Mental Health (NIMH)
Research Project (R01)
Study Section
Social Psychology, Personality and Interpersonal Processes Study Section (SPIP)
Program Officer
Simmons, Janine M
University of Pittsburgh
Schools of Arts and Sciences
United States
Cohn, Jeffrey F; Okun, Michael S; Jeni, Laszlo A et al. (2018) Automated Affect Detection in Deep Brain Stimulation for Obsessive-Compulsive Disorder: A Pilot Study. Proc ACM Int Conf Multimodal Interact 2018:40-44
Dibeklioglu, Hamdi; Hammal, Zakia; Cohn, Jeffrey F (2018) Dynamic Multimodal Measurement of Depression Severity Using Deep Autoencoding. IEEE J Biomed Health Inform 22:525-536
Hammal, Zakia; Cohn, Jeffrey F; Wallace, Erin R et al. (2018) Facial Expressiveness in Infants With and Without Craniofacial Microsomia: Preliminary Findings. Cleft Palate Craniofac J 55:711-720
Kacem, Anis; Hammal, Zakia; Daoudi, Mohamed et al. (2018) Detecting Depression Severity by Interpretable Representations of Motion Dynamics. Proc Int Conf Autom Face Gesture Recognit 2018:739-745
Girard, Jeffrey M; Wright, Aidan G C (2018) DARMA: Software for dual axis rating and media annotation. Behav Res Methods 50:902-909
Girard, Jeffrey M; Chu, Wen-Sheng; Jeni, László A et al. (2017) Sayette Group Formation Task (GFT) Spontaneous Facial Expression Database. Proc Int Conf Autom Face Gesture Recognit 2017:581-588
Valstar, Michel F; Sánchez-Lozano, Enrique; Cohn, Jeffrey F et al. (2017) FERA 2017 - Addressing Head Pose in the Third Facial Expression Recognition and Analysis Challenge. Proc Int Conf Autom Face Gesture Recognit 2017:839-847
Wen-Sheng Chu; De la Torre, Fernando; Cohn, Jeffrey F (2017) Selective Transfer Machine for Personalized Facial Expression Analysis. IEEE Trans Pattern Anal Mach Intell 39:529-545
Hammal, Zakia; Chu, Wen-Sheng; Cohn, Jeffrey F et al. (2017) Automatic Action Unit Detection in Infants Using Convolutional Neural Network. Int Conf Affect Comput Intell Interact Workshops 2017:216-221
Chu, Wen-Sheng; De la Torre, Fernando; Cohn, Jeffrey F et al. (2017) A Branch-and-Bound Framework for Unsupervised Common Event Discovery. Int J Comput Vis 123:372-391

Showing the most recent 10 out of 31 publications