We will develop novel computer vision tools to reliably and precisely measure nonverbal social communication by quantifying communicative facial and bodily expressions. Our tools will be designed and developed to maximize usability by non-engineer behavioral scientists, filling the enormous gap between engineering advances and their clinical accessibility.

Significance: Social interaction inherently relies on the perception and production of coordinated face and body expressions. Indeed, atypical face and body movements are observed in many disorders, impacting social interaction and communication. Traditional systems for quantifying nonverbal communication (e.g., FACS, BAP) require extensive training and coding time; these tedious coding requirements drastically limit their scalability and reproducibility. While an extensive literature exists on advanced computer vision and machine learning techniques for face and body analysis, no well-established method is commonly used in the mental health community to quantify the production of facial and bodily expressions or to efficiently capture individual differences in nonverbal communication more generally. As part of this proposal, we will develop a computer vision toolbox whose tools are both highly granular and highly scalable, allowing measurement of complex social behavior in large and heterogeneous populations.

Approach: Our team will develop tools that provide granular metrics of nonverbal social behavior, including localized face and body kinematics, characteristics of elicited expressions, and imitation performance. Our tools will facilitate measurement of social communication both within a person and between people, allowing assessment of individual social communication cues as well as those that occur within bidirectional social contexts.

Preliminary Data: We have developed and applied novel computer vision tools to assess: (1) diversity of mouth motion during conversational speech (effect size d = 1.0 in differentiating young adults with and without autism during a brief natural conversation), (2) interpersonal facial coordination (91% accuracy in classifying autism diagnosis in young adults during a brief natural conversation, replicated in an independent child sample), and (3) body action imitation (85% accuracy in classifying autism diagnosis based on body imitation performance). As part of the current proposal, we will develop more generic methods that can be used in normative and clinical samples.
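
The abstract names the preliminary measures but not the underlying algorithms. As a rough illustration only, the sketch below shows one plausible formulation of a mouth-motion diversity score, assuming pre-extracted 2D mouth-landmark trajectories (e.g., from a tool such as OpenFace or MediaPipe); the function name and the entropy-based measure are illustrative assumptions, not the authors' published method.

```python
import numpy as np

def mouth_motion_diversity(landmarks, n_bins=16):
    """Illustrative diversity measure for mouth motion.

    landmarks: (T, K, 2) array of 2D mouth-landmark positions over T frames.
    Returns the Shannon entropy (bits) of frame-to-frame displacement
    magnitudes, a simple proxy for how varied the mouth movements are.
    NOTE: assumed formulation; the grant abstract does not specify the
    actual algorithm.
    """
    # Per-frame displacement of each landmark, averaged over landmarks.
    disp = np.linalg.norm(np.diff(landmarks, axis=0), axis=-1).mean(axis=1)
    # Histogram the displacement magnitudes and compute Shannon entropy.
    hist, _ = np.histogram(disp, bins=n_bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Example with stand-in data: 300 frames, 20 landmarks, random-walk motion.
rng = np.random.default_rng(0)
lm = np.cumsum(rng.normal(size=(300, 20, 2)), axis=0)
print(mouth_motion_diversity(lm))
```
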
Aims: In Aim 1, we will develop tools to automatically quantify fine-grained face movements and their coordination during facial expression production;
in Aim 2, we will develop tools to quantify body joint kinematics and their coordination during bodily expression production;
in Aim 3, we will demonstrate the tools' ability to yield dimensional metrics using machine learning. Impact: Our approach is designed for fast and rigorous assessment of nonverbal social communication, providing a scalable solution for measuring individual variability within a dimensional and transdiagnostic framework.
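
Aims 1 and 2 both center on coordination and joint kinematics, and the abstract leaves their computation unspecified. The following is a minimal sketch of two such metrics under stated assumptions: keypoint trajectories are already extracted (e.g., by a pose estimator such as OpenPose), and both `windowed_coordination` and `joint_angle` are hypothetical helpers illustrating one common approach (windowed cross-correlation; segment-angle kinematics), not the grant's actual methods.

```python
import numpy as np

def windowed_coordination(sig_a, sig_b, win=90, step=30, max_lag=15):
    """Mean peak absolute cross-correlation (within +/- max_lag frames)
    over sliding windows, between two partners' motion time series.
    One plausible lagged-coordination metric; assumed, not published."""
    peaks = []
    for start in range(0, len(sig_a) - win + 1, step):
        a = sig_a[start:start + win]
        b = sig_b[start:start + win]
        # Z-score each window so the correlation is scale-invariant.
        a = (a - a.mean()) / (a.std() + 1e-8)
        b = (b - b.mean()) / (b.std() + 1e-8)
        xc = np.correlate(a, b, mode="full") / win  # zero lag at index win-1
        mid = win - 1
        peaks.append(np.abs(xc[mid - max_lag:mid + max_lag + 1]).max())
    return float(np.mean(peaks))

def joint_angle(p_prox, p_joint, p_dist):
    """Angle (radians) at p_joint between the segments to p_prox and
    p_dist; inputs are (T, 2) or (T, 3) keypoint trajectories."""
    u = p_prox - p_joint
    v = p_dist - p_joint
    cos = (u * v).sum(-1) / (
        np.linalg.norm(u, axis=-1) * np.linalg.norm(v, axis=-1) + 1e-8)
    return np.arccos(np.clip(cos, -1.0, 1.0))
```

Differentiating a `joint_angle` series over time would yield angular velocity, one example of the body joint kinematics named in Aim 2.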

Public Health Relevance

This project develops novel tools for measuring nonverbal social communication as manifested through facial and bodily expressions. Using advanced computer vision and machine learning methodologies, we will quantify humans' communicative social behavior. The results of this project will impact public health by facilitating a rich characterization of the normative development of social functioning, by providing access to precise phenotypic information for neuroscience and genetics studies, and by measuring subtle individual differences to determine whether some interventions or treatments work better than others.

Agency
National Institutes of Health (NIH)
Institute
National Institute of Mental Health (NIMH)
Type
Research Project (R01)
Project #
5R01MH122599-02
Application #
10150095
Study Section
Biomedical Computing and Health Informatics Study Section (BCHI)
Program Officer
Vaziri, Siavash
Project Start
2020-05-01
Project End
2025-02-28
Budget Start
2021-03-01
Budget End
2022-02-28
Support Year
2
Fiscal Year
2021
Name
Children's Hospital of Philadelphia
DUNS #
073757627
City
Philadelphia
State
PA
Country
United States
Zip Code
19146