Difficulties in facial emotion recognition (FER) are thought to cause or exacerbate social disability in people with autism spectrum disorder (ASD) by preventing 1) accurate detection of social/emotional information conveyed through the face, particularly the eye region, and 2) deployment of emotionally appropriate responses. Although the neural systems thought to underlie FER deficits in ASD are increasingly well understood, their plasticity remains speculative. The goal of this project is to develop an assistive technology to promote facial emotion recognition in ASD [R21]. We propose that FER can be rehabilitated using a brain-computer interface (BCI) device [R33]. To develop an FER assistant, we plan first [R21] to determine whether a multi-voxel classifier can be developed that is temporally predictive of successful emotion recognition during functional magnetic resonance imaging (fMRI). An adaptive, real-time fMRI (rt-fMRI) paradigm will interpret the output of a subject's brain to assess whether a computer-generated actor's emotion has been recognized. If not, the intensity of the expressed facial emotion will be increased until the computer determines that the subject has recognized it. After tuning this supervised learning algorithm, a support vector machine (SVM), we will transform the high-dimensional classifier into a low-dimensional space whose output can be replicated by a single- or dual-sensor EEG device placed on the scalp. The proof of principle is that the multivariate classifier can be forward-transformed into frequency (EEG) space. The EEG sensor can be comfortably worn outside the scanner (BCI device) and wirelessly linked to a portable tablet (iPad). We will then demonstrate the feasibility of an ambulatory BCI 'FER assistant' [R33] in a between-group, randomized design (genuine neurofeedback vs. placebo neurofeedback).
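The adaptive rt-fMRI paradigm described above can be sketched as a simple staircase loop: present the emotion at low intensity, query the classifier on the incoming brain data, and ramp intensity upward until recognition is detected. The sketch below is purely illustrative; the classifier stub, step size, and threshold are assumptions, not the project's actual SVM pipeline.

```python
# Illustrative sketch of the adaptive intensity staircase (not the actual
# rt-fMRI implementation). The classifier is stubbed out with a toy rule.

def classifier_says_recognized(brain_signal, intensity, threshold=0.7):
    """Stub for the multi-voxel SVM decision on the current fMRI volume.
    Placeholder heuristic: recognition becomes likely at high intensity."""
    return brain_signal * intensity >= threshold

def adaptive_trial(brain_signal, start=0.2, step=0.1, max_intensity=1.0):
    """Ramp the avatar's emotion intensity until recognition is detected.
    Returns the intensity at which the classifier first fired, or the
    ceiling if recognition was never detected."""
    intensity = start
    while intensity < max_intensity:
        if classifier_says_recognized(brain_signal, intensity):
            return intensity
        intensity = round(intensity + step, 10)  # avoid float drift
    return max_intensity

# A strong "recognition signal" (1.0) crosses the stub threshold at 0.7;
# a null signal (0.0) never triggers recognition and hits the ceiling.
print(adaptive_trial(1.0))
print(adaptive_trial(0.0))
```

In the proposed paradigm the returned intensity itself is informative: lower values indicate that the subject recognized the emotion earlier in the ramp.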
The FER assistant is a virtual reality-based iPad application that uses the EEG sensor data to assist users with emotion recognition by manipulating the avatar's emotion intensity until it is recognized by the user, who earns more points the earlier the emotion is recognized. The purpose of this randomized controlled trial (RCT) is to assess feasibility, including acceptability of the intervention, recruitment and randomization procedures, intervention implementation, blinded assessment procedures, and participant retention, within the context of an RCT in preparation for a well-powered efficacy trial. This study's products include demonstration of the neural processes that underlie FER deficits and evidence of their plasticity, as well as an easily exportable, minimal-cost computer-based intervention. There has been little treatment research for this under-studied population, and social deficits may pose unique challenges to people with ASD during late adolescence and early adulthood, as they face multiple life transitions and developmental tasks requiring social competence (e.g., securing employment). Ultimately, we plan to evaluate the efficacy of this emergent intervention in an adequately powered randomized clinical trial.
The social disability that characterizes Autism Spectrum Disorder (ASD) pervades other areas of adaptive behavior, is predictive of secondary mental health problems, and adversely affects long-term outcome. Although ASD is a chronic condition, there has been little research on interventions for adults with ASD to target social disability. We propose to first establish the neural plasticity of specific brain mechanisms underlying difficulties with facial emotion recognition, a core deficit believed to be pivotal in the behavioral expression of ASD social disability, and subsequently develop a novel, computer-based intervention using real-time feedback to ameliorate emotion recognition deficits.