Children with autism spectrum disorder (ASD) have often been observed to express affect weakly, in only one modality at a time (e.g., choice of words), or in multiple modalities but not in a coordinated fashion. These difficulties in crossmodal integration of affect expression may have roots in certain global characteristics of brain structure in autism, specifically atypical interconnectivity between brain areas. Poor crossmodal integration of affect expression may also play a critical role in the communication difficulties that are well documented in ASD: not understanding how, for example, a facial expression can modify the interpretation of words undermines social reciprocity. Impairment in crossmodal integration of affect is thus a potentially powerful explanatory concept in ASD. The study will provide much-needed data on expressive crossmodal integration impairment in ASD and its association with receptive crossmodal integration impairment, using innovative technologies to create stimuli for a judgment procedure that makes possible independent assessment of the individual modalities; these technologies are critical because human observers are not able to selectively filter out modalities. In addition, the vocal measures and the audiovisual database lay the essential groundwork for the next step: creation of audiovisual analysis methods for automated assessment of expressive crossmodal integration. These methods will be applied to audiovisual recordings of a structured play situation; the child will participate in this play situation twice, once with a caregiver and once with an examiner. This procedure for measuring expressive crossmodal integration will be complemented by a procedure for measuring crossmodal integration of affect processing using dynamic talking-face stimuli in which the audio and video streams are recombined (preserving perfect synchrony of the facial and vocal channels) to create stimuli with congruent vs. 
incongruent affect expression. Both procedures will be applied to three groups, ages six to ten: children with ASD, children with Developmental Language Disorder (DLD), and typically developing children. Our study would be the first to perform a comprehensive analysis of crossmodal integration of affect expression in ASD. If the study confirms the existence of these impairments in ASD and provides a detailed picture of them, this could (i) guide brain studies to specifically target areas responsible for affect expression; (ii) provide a deeper understanding of impairments in social reciprocity; and (iii) help design remedial programs for intensive training of under-used or uncoordinated expressive modalities. The study thus contributes to etiology, diagnosis, and treatment.
Children with autism spectrum disorder (ASD) have often been observed to express affect weakly, in only one modality at a time (e.g., choice of words), or in multiple modalities but not in a coordinated fashion. However, studies conducted to date have not conclusively determined whether crossmodal integration of affect expression is indeed impaired in ASD or what its detailed behavioral characteristics are. The goal of the project is to detail crossmodal integration of affect expression in ASD (with typically developing children and children with developmental language disorder as comparison groups) using an innovative method in which we independently measure affect expression in four expressive modalities (i.e., facial, gestural, vocal-prosody, and vocal-content) and quantitatively determine discrepancies between how affect is expressed in the different modalities.