Our innovative and fully automated approach to the analysis of social behavior addresses the pressing need for precise and scalable measurements of the autism spectrum disorder (ASD) phenotype. Using computer vision and machine learning methods, we have created a novel, quantitative method for fine-grained analysis of social interactions. Our approach directly measures interpersonal motor synchrony, a construct that we use as a lens for understanding the social interaction differences that are at the core of ASD.

Significance: Genomics and neuroimaging methods continuously evolve, providing deeper insights into the biology of ASD. However, methods for measuring the outward manifestations of ASD have not changed substantially in decades. ASD is fundamentally a disorder of social interaction, but current clinical tools do not directly measure observable social interactions. Instead, they summarize global impressions of these interactions via informant report questionnaires or observational coding schemes that typically lack the behavioral granularity needed to robustly measure individual differences and changes across time (e.g., treatment-related change). Inter-rater agreement on questionnaires is typically modest, while the alternative, "deep phenotyping" by expert clinicians, is a time-consuming and often cost-prohibitive burden to studies, especially when large samples are required (e.g., in genomics research).

Approach: To resolve these problems, our team created a novel computational framework that leverages advances in markerless video motion capture, computer vision, and machine learning to directly capture dyadic social interactions. This allows us to capture all behaviors observable by expert clinicians but with exquisite digital precision and objectivity.

Preliminary Data: We developed a fully automatic quantitative assessment of interpersonal social behavior focused on features of dyadic facial motor synchrony.
When applied to videos of brief conversations between confederates and young adults with or without ASD, our assessment predicted diagnostic status with 91% accuracy, significantly better than highly trained clinical experts assessing the same video recordings. The set of predictive social motor synchrony features that we identified also correlated significantly with symptom severity in the ASD group, suggesting that it can be used for both diagnostic classification and evaluating individual differences (vital for advancing precision medicine goals). Importantly, our findings were reproducible across samples: the same features identified in our adult analysis also predicted diagnosis in a child sample with high accuracy.
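To make the synchrony construct concrete, one common way to quantify dyadic motor synchrony from motion-capture output is the peak lagged cross-correlation between the two partners' motion time series. The sketch below is illustrative only and is not the project's actual feature set: the function name `lagged_sync`, the lag window, and the simulated "facial motion" signals are all assumptions introduced here for exposition.

```python
import numpy as np

def lagged_sync(a, b, max_lag=15):
    """Peak absolute Pearson correlation between two motion time
    series over lags in [-max_lag, max_lag] frames. Higher values
    indicate that one partner's motion tracks the other's, possibly
    with a short temporal offset (following behavior)."""
    best = 0.0
    n = len(a)
    for lag in range(-max_lag, max_lag + 1):
        # Align a[t] with b[t + lag] by trimming the two series.
        if lag < 0:
            x, y = a[:lag], b[-lag:]
        elif lag > 0:
            x, y = a[lag:], b[: n - lag]
        else:
            x, y = a, b
        if len(x) > 2 and np.std(x) > 0 and np.std(y) > 0:
            best = max(best, abs(np.corrcoef(x, y)[0, 1]))
    return best

# Toy example: partner B echoes partner A's motion 10 frames later.
rng = np.random.default_rng(0)
a = rng.standard_normal(300)
b = np.r_[np.zeros(10), a[:-10]] + 0.1 * rng.standard_normal(300)
print(lagged_sync(a, b))   # near 1.0; uncoupled noise scores far lower
```

In practice, such a statistic would be computed per facial region (or per body part, as in Aim 2) over sliding windows, and the resulting feature vectors fed to a classifier.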
Aims. In Aim 1, we will test the specificity of our computer vision approach by expanding comparisons to include a mixed psychiatric disorder group;
Aim 2 will test dyadic synchrony in other body movements, and Aim 3 will define associations between interpersonal motor synchrony and dimensional aspects of social communication that span diagnostic categories. Impact: Our approach is designed for fast and rigorous assessment of social communication, providing a scalable solution for diagnosing ASD and measuring individual variability within a transdiagnostic, precision medicine framework.
This project tests a new technique for measuring social communication problems in children with autism spectrum disorder (ASD). Using markerless motion capture, computer vision, and machine learning, we will measure humans' natural tendency to coordinate their behavior with a conversational partner. The results of this project will impact public health by making it easier to reliably diagnose ASD, by providing access to precise phenotypic information for genotypic studies, and by providing the theoretical and empirical foundation for an objective way to determine whether some treatments work better than others.