With the most recent Centers for Disease Control and Prevention (CDC) prevalence estimate for autism spectrum disorder (ASD) in children at 1 in 88, effective early identification and treatment are often characterized as a public health emergency. Given the present limits of intervention science and the enormous costs of the disorder across the lifespan, there is an urgent need for more efficacious treatments that can enhance intervention outcomes. A growing number of studies have investigated the application of advanced interactive technologies to ASD intervention, including computer technology, virtual reality (VR) environments, and, more recently, robotic systems. The primary goal of the proposed research is the design and preliminary testing of a robust robotic intervention platform and environment specifically designed to accelerate improvements in early areas of core ASD deficit. In particular, it focuses on developing a robotic intervention system (ARIA: Adaptive Robot-mediated Intervention Architecture) that seamlessly integrates real-time, non-invasive gaze detection with an intelligent environment able to autonomously alter system function based on child performance, in order to impact early joint attention skills, which are thought to be fundamental social communication building blocks central to the etiology and treatment of ASD.

While it is unlikely that technological advances will take the place of traditional intervention paradigms, we hypothesize that adaptive robotic systems may hold great value as accelerant technologies that enhance existing intervention modalities in potent ways, particularly for children who show pronounced differences in their contingent responses to non-biological actions and events relative to interactions with social partners at very early ages. In the proposed research, we will investigate the realistic potential of robotic technology for young children with ASD through the explicit design and testing of such a system to improve performance within the domain of early joint attention skills. Thus, the specific aims and milestones of the R21 phase of our project focus on: 1) achieving integration of non-contact gaze technology into the system, 2) realizing autonomous closed-loop operation of our system architecture, and 3) demonstrating adequate initial user functioning while interacting with the autonomous system. Subsequent to attaining these milestones and demonstrating initial system capacity, the R33 phase of our project will assess the performance of the robot-mediated architecture in a pilot intervention experiment. This trial will specifically evaluate system capacities as related to very young children with ASD, as well as feasibility data related to recruitment and retention in the protocol, user tolerance, and patient satisfaction. This pilot trial will also assess within-system improvement in joint attention skills, as well as a methodology for assessing potential generalization of such skills in a larger trial.
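To make the closed-loop concept concrete, the sketch below illustrates, in simplified Python, one way a gaze-contingent joint attention prompt loop of this kind could be organized: a prompt is delivered, gaze is monitored against a target region, and the level of prompt support adapts to the child's performance. This is a minimal sketch under assumed interfaces; the class names, prompt hierarchy, timings, and simulated components are hypothetical stand-ins, not the ARIA implementation.

```python
import random
import time

# Illustrative prompt hierarchy, from least to most supportive (an assumption).
PROMPT_HIERARCHY = ["head_turn_only", "head_turn_plus_verbal", "head_turn_verbal_plus_target_cue"]


class SimulatedGazeTracker:
    """Stand-in for a non-contact gaze tracker; returns a 2-D gaze point in screen coordinates."""
    def current_point(self):
        return (random.uniform(0, 1920), random.uniform(0, 1080))


class SimulatedRobot:
    """Stand-in for the robot's prompt and feedback controller."""
    def deliver_prompt(self, prompt, target):
        print(f"Robot prompt: {prompt} toward {target}")

    def give_reinforcement(self):
        print("Robot delivers contingent reinforcement")


class TargetRegion:
    """Rectangular region of interest the child is prompted to orient toward."""
    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h

    def contains(self, point):
        px, py = point
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


def run_session(gaze, robot, target, n_trials=10, trial_timeout_s=8.0):
    """Closed loop: deliver a prompt, watch gaze until success or timeout,
    then step prompt support down after success and up after failure."""
    level = 0
    for _ in range(n_trials):
        robot.deliver_prompt(PROMPT_HIERARCHY[level], (target.x, target.y))
        start = time.monotonic()
        hit = False
        while time.monotonic() - start < trial_timeout_s:
            if target.contains(gaze.current_point()):
                robot.give_reinforcement()
                hit = True
                break
            time.sleep(0.05)  # ~20 Hz polling of the gaze stream
        level = max(level - 1, 0) if hit else min(level + 1, len(PROMPT_HIERARCHY) - 1)


if __name__ == "__main__":
    run_session(SimulatedGazeTracker(), SimulatedRobot(), TargetRegion(800, 400, 300, 300))
```

In the sketch, the simulated components would be replaced by the actual gaze-tracking and robot-control modules; the adaptation rule shown (one step of prompt support per trial) is only one plausible policy.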
The project will develop, apply, and, via an initial pilot study, examine the feasibility of a novel adaptive robotic technology as a potential intervention tool for young children with autism spectrum disorder (ASD). The proposed robotic system is designed as an 'intelligent' system that automatically adjusts intervention tasks based on where the child is looking, measured via non-invasive eye-gaze tracking, in order to enhance performance on tasks of early social orienting. It is believed that the successful development and application of this new technology have the potential to accelerate movement toward accessible, personalized, targeted robot-based ASD intervention of relevance to the core neurodevelopmental deficits of the disorder.