Our recent studies examining visual fixation patterns to both naturalistic and ambiguous social stimuli in toddlers with autism spectrum disorders (ASD) revealed that viewing patterns are driven by the physical contingencies of the stimuli rather than by their social context. In this project, we adopt a stepwise, experimental approach to determine how physical and social contingencies influence, and can be used to induce changes in, visual attention in 12- to 24-month-old infants with ASD (N=70) relative to well-matched non-autistic developmentally delayed (DD; N=70) and typically developing (TD; N=60) infants. To do this, we will create audio and video stimuli that display physical contingencies, in the form of audiovisual synchrony, and social contingencies, in the form of the social context afforded by faces and speech. Using eye-tracking technology to measure visual fixation, we will determine how manipulating these contingencies affects viewing patterns. The first specific aim is to establish whether infants can track physical contingencies independent of any potential influence from social contingencies. We will determine the extent to which children with autism can follow basic patterns of audiovisual synchrony in co-presented audio and video stimuli when there are no implicit cues from social context that might affect visual attention. Our results will show whether infants with ASD are more or less sensitive than their TD or DD peers to specific forms of audiovisual information. The second specific aim is to investigate how social contingencies influence infant perception of physical contingencies. We will determine the extent to which cues for social context, presented in either the auditory or the visual modality, affect the ability of children with autism to track patterns of audiovisual synchrony.
The results of our experiments will indicate whether audiovisual perception in infants with ASD is more or less susceptible to specific audiovisual cues for social context than in TD or DD controls. The third specific aim is to explore whether changes in visual scanning in infants can be induced by manipulating physical and social contingencies. By dynamically modifying patterns of audiovisual synchrony in otherwise naturalistic social scenes, we will test whether visual attention in children with autism can be altered experimentally. Our results will demonstrate whether manipulation of physical and social contingencies may provide a mechanism for therapeutic repurposing of visual attention in children with autism. This project will provide unique insights into the ways in which individuals with autism search for meaning in their immediate sensory environment when confronted with social situations, and will also suggest possible avenues for early diagnosis and treatment. The long-term objective of this research is to develop the eye-tracking paradigm into a laboratory-based quantifier of social disability, the results of which can be exploited for tailoring effective individual remedial therapies. This project addresses several key action items of the NIH Interagency Autism Coordinating Committee, emphasizing developmental markers and screening in infants.
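As an illustration of the kind of quantitative measure the eye-tracking paradigm yields, the sketch below computes a simple preferential-looking index: the proportion of fixation time directed to a display that is synchronized with the soundtrack versus one that is not. This is a generic illustration only; the function name, data format, and region labels are assumptions for exposition, not the project's actual analysis pipeline.

```python
# Illustrative sketch (assumed data format, not the project's actual pipeline):
# in a preferential-looking paradigm, two displays are shown side by side,
# one synchronized with the soundtrack ("sync") and one not ("async").
# Sensitivity to audiovisual synchrony can be indexed as the proportion of
# on-display fixation time spent on the synchronous display.

def synchrony_preference(fixations):
    """Proportion of on-display looking time on the synchronous display.

    `fixations` is a list of (region, duration_ms) tuples, where region is
    'sync', 'async', or 'off' (off-screen / neither display).
    """
    sync_time = sum(d for r, d in fixations if r == "sync")
    async_time = sum(d for r, d in fixations if r == "async")
    total = sync_time + async_time
    if total == 0:
        return None  # no on-display looking time recorded
    return sync_time / total

# Example trial: 1200 ms on the synchronous display, 800 ms on the asynchronous one
trial = [("sync", 700), ("async", 800), ("sync", 500), ("off", 300)]
print(synchrony_preference(trial))  # 0.6
```

An index above 0.5 would indicate preferential looking toward the synchronous display; group comparisons (ASD vs. DD vs. TD) would then test whether that preference differs across populations and stimulus conditions.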
Our ability to diagnose and treat individuals with autism spectrum disorders relies on the development of effective practical tools for quantifying and remedying the social and communicative disabilities that characterize this group. The research in this project will significantly further our understanding of the mechanisms underlying visual attention to social stimuli in infants with autism, and will show how quantification of visual scanning behavior in infants using eye-tracking technology may provide tools for diagnosis and treatment. This research addresses several key action items of the NIH Interagency Autism Coordinating Committee, emphasizing developmental markers and screening in infants as well as neurodevelopmental processes, and is, therefore, highly relevant to public health.