Children with autism struggle to recognize facial expressions, make eye contact, and engage in social interactions. Many of these children can have dramatic recoveries, particularly when social skills are taught from an early age. In today's healthcare system, however, delivery of behavioral intervention is bottlenecked by a sharp and growing imbalance between the number of behavioral therapists and the number of children in need of care. There is therefore an urgent need for mobile, scalable methods of care delivery. We have developed an artificial intelligence tool for automatic facial expression recognition that runs on Google Glass through an Android app and delivers instantaneous social cues to individuals with autism in their natural environment, providing therapy that today is given only by clinicians in non-scalable person-to-person sessions. The system uses Glass's outward-facing camera to read the facial expressions of a conversation partner and passes the detected facial landmarks to a native Android app for immediate machine learning-based emotion classification; it then gives the child wearer real-time social cues and records his or her social responses. We believe that the system's ability to provide continuous behavioral therapy outside of clinical settings will enable dramatically faster gains in social acuity and, within a limited and self-directed period of use, permit the child to engage in social settings on his or her own. This proposal outlines three main aims to test, refine, and optimize the tool, together with a series of validation experiments needed to bring our system from prototype to a viable clinical tool that every family can use regularly from home for precision healthcare.
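To illustrate the classification step described above, the sketch below shows, in simplified form, how per-frame facial landmarks could be turned into a feature vector, classified into an emotion, and mapped to a social cue. It is not the project's implementation: the 68-point landmark layout, the eight-emotion label set, the scikit-learn logistic-regression classifier, the synthetic training data, and names such as landmarks_to_features and classify_and_cue are all illustrative assumptions.

    # Illustrative sketch only: landmark frame -> emotion label -> social cue.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    EMOTIONS = ["happy", "sad", "angry", "scared", "surprised",
                "disgusted", "neutral", "contempt"]   # assumed label set

    def landmarks_to_features(landmarks):
        # Flatten (x, y) landmark points into a translation- and scale-normalized vector.
        pts = np.asarray(landmarks, dtype=float)      # shape: (n_points, 2)
        pts = pts - pts.mean(axis=0)                  # remove head position
        scale = np.linalg.norm(pts) or 1.0            # avoid division by zero
        return (pts / scale).ravel()

    # Hypothetical training data standing in for labeled landmark frames;
    # a real system would be trained on curated facial-expression datasets.
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(400, 68 * 2))          # 68 landmarks -> 136 features
    y_train = rng.integers(0, len(EMOTIONS), size=400)

    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    def classify_and_cue(landmarks):
        # One frame of landmarks in, one (emotion, cue) pair out.
        x = landmarks_to_features(landmarks).reshape(1, -1)
        emotion = EMOTIONS[int(clf.predict(x)[0])]
        return emotion, f"The person you're looking at seems {emotion}."

    # Example: a single frame of 68 synthetic landmark coordinates.
    frame = rng.normal(size=(68, 2))
    print(classify_and_cue(frame))

In the deployed system such a cue would be rendered on the Glass display or spoken through its speaker, while the frame-level predictions are logged so that social responses can be reviewed later.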

Public Health Relevance

We have developed a combined software-hardware solution built on top of Google Glass that enables real-time expression recognition and field-of-view eye tracking to capture social interaction data and give guiding social cues to an individual with autism. This project will test and optimize the potential of this tool to provide continuous, naturalistic behavioral therapy to children with autism in their homes.

Agency
National Institutes of Health (NIH)
Institute
National Institute of Biomedical Imaging and Bioengineering (NIBIB)
Type
Research Project (R01)
Project #
1R01EB025025-01
Application #
9394169
Study Section
Special Emphasis Panel (ZRG1)
Program Officer
Lash, Tiffani Bailey
Project Start
2017-07-05
Project End
2020-03-31
Budget Start
2017-07-05
Budget End
2018-03-31
Support Year
1
Fiscal Year
2017
Total Cost
Indirect Cost
Name
Stanford University
Department
Pediatrics
Type
Schools of Medicine
DUNS #
009214214
City
Stanford
State
CA
Country
United States
Zip Code
94304
Paskov, Kelley M; Wall, Dennis P (2018) A Low Rank Model for Phenotype Imputation in Autism Spectrum Disorder. AMIA Jt Summits Transl Sci Proc 2017:178-187
Daniels, Jena; Haber, Nick; Voss, Catalin et al. (2018) Feasibility Testing of a Wearable Behavioral Aid for Social Learning in Children with Autism. Appl Clin Inform 9:129-140
Levy, Sebastien; Duda, Marlena; Haber, Nick et al. (2017) Sparsifying machine learning models identify stable subsets of predictive features for behavioral detection of autism. Mol Autism 8:65