With the most recent Centers for Disease Control and Prevention (CDC) prevalence estimate for children with ASD at 1 in 88, effective early identification and treatment is often characterized as a public health emergency. Given the present limits of intervention science and the enormous costs of the disorder across the lifespan, there is an urgent need for more efficacious treatments that can enhance intervention outcomes. A growing number of studies have investigated the application of advanced interactive technologies to ASD intervention, including computer technology, virtual reality (VR) environments, and, more recently, robotic systems. The primary goal of the proposed research is the design and preliminary testing of a robust robotic intervention platform and environment specifically designed to accelerate improvement in areas of core early ASD deficit. In particular, the project develops a robotic intervention system (ARIA: Adaptive Robot-mediated Intervention Architecture) that seamlessly integrates real-time, non-invasive gaze detection with an intelligent environment capable of autonomously altering system behavior based on child performance. The system targets early joint attention skills, which are thought to be fundamental social communication building blocks central to the etiology and treatment of ASD. While technological advances are unlikely to replace traditional intervention paradigms, we hypothesize that adaptive robotic systems may hold great value as accelerant technologies that enhance existing intervention modalities in potent ways, particularly for children who, from very early ages, respond markedly differently to non-biological actions and events than to interactions with social partners.
In the proposed research, we will investigate the realistic potential of robotic technology for young children with ASD via the explicit design and testing of such a system to improve performance in the domain of early joint attention skills. The specific aims and milestones of the R21 phase of our project thus focus on: 1) achieving integration of non-contact gaze-tracking technology into the system, 2) realizing autonomous closed-loop operation of our system architecture, and 3) demonstrating adequate initial user functioning during interaction with the autonomous system. After attaining these milestones and demonstrating initial system capacity, the R33 phase of our project will assess the performance of the robot-mediated architecture in a pilot intervention experiment. This trial will specifically evaluate system capacities with very young children with ASD, as well as feasibility data related to recruitment and retention in the protocol, user tolerance, and patient satisfaction. The pilot trial will also assess within-system improvement in joint attention skills, along with a methodology for assessing potential generalization of such skills in a larger trial.
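The closed-loop operation described above can be illustrated with a minimal sketch: the system reads a gaze estimate, checks whether the child oriented to the prompted target, and fades or escalates prompting accordingly. All names here (GazeSample, Target, next_prompt_level, the normalized coordinates, and the four prompt levels) are illustrative assumptions for exposition, not part of the actual ARIA implementation.

```python
# Hypothetical sketch of a gaze-contingent closed loop, assuming a gaze
# estimate in normalized screen coordinates and a small set of discrete
# prompt levels (0 = no prompt .. 3 = maximal prompt).
from dataclasses import dataclass


@dataclass
class GazeSample:
    x: float  # horizontal gaze coordinate, normalized to [0, 1]
    y: float  # vertical gaze coordinate, normalized to [0, 1]


@dataclass
class Target:
    x: float
    y: float
    radius: float  # acceptance radius for counting a "hit" on the target


def gaze_on_target(sample: GazeSample, target: Target) -> bool:
    """True if the gaze point falls within the target's acceptance region."""
    dx, dy = sample.x - target.x, sample.y - target.y
    return dx * dx + dy * dy <= target.radius * target.radius


def next_prompt_level(level: int, hit: bool,
                      min_level: int = 0, max_level: int = 3) -> int:
    """Fade prompting after a success; escalate after a miss (bounded)."""
    return max(min_level, level - 1) if hit else min(max_level, level + 1)
```

In a real session loop, `gaze_on_target` would be fed by the non-contact eye tracker each frame, and `next_prompt_level` would drive the robot's next cue (e.g., verbal prompt, head turn, or pointing gesture) for the following trial.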

Public Health Relevance

The project will develop, apply, and, via an initial pilot study, examine the feasibility of a novel adaptive robotic technology as a potential intervention tool for young children with Autism Spectrum Disorders (ASD). The proposed robotic system is designed as an 'intelligent' system that automatically adjusts intervention tasks based on information about where the child is looking (captured via non-invasive eye-gaze tracking) in order to enhance performance on tasks of early social orienting. It is believed that the successful development and application of this new technology has the potential to accelerate movement toward accessible, personalized, targeted robot-based ASD intervention relevant to the core neurodevelopmental deficits of the disorder.

Agency
National Institutes of Health (NIH)
Institute
National Institute of Mental Health (NIMH)
Type
Exploratory/Developmental Grants (R21)
Project #
1R21MH103518-01
Application #
8680794
Study Section
Special Emphasis Panel (ZMH1-ERB-D (01))
Program Officer
Gilotty, Lisa
Project Start
2014-04-01
Project End
2016-03-31
Budget Start
2014-04-01
Budget End
2015-03-31
Support Year
1
Fiscal Year
2014
Total Cost
$248,271
Indirect Cost
$78,111
Name
Vanderbilt University Medical Center
Department
Engineering (All Types)
Type
Schools of Engineering
DUNS #
004413456
City
Nashville
State
TN
Country
United States
Zip Code
37212
Zhao, Huan; Swanson, Amy R; Weitlauf, Amy S et al. (2018) Hand-in-Hand: A Communication-Enhancement Collaborative Virtual Reality System for Promoting Social Interaction in Children with Autism Spectrum Disorders. IEEE Trans Hum Mach Syst 48:136-148
Zheng, Zhi; Zhao, Huan; Swanson, Amy R et al. (2018) Design, Development, and Evaluation of a Noninvasive Autonomous Robot-mediated Joint Attention Intervention System for Young Children with ASD. IEEE Trans Hum Mach Syst 48:125-135
Zheng, Zhi; Fu, Qiang; Zhao, Huan et al. (2017) Design of an Autonomous Social Orienting Training System (ASOTS) for Young Children With Autism. IEEE Trans Neural Syst Rehabil Eng 25:668-678
Zhang, Lian; Wade, Joshua; Bian, Dayi et al. (2017) Cognitive Load Measurement in a Virtual Reality-based Driving System for Autism Intervention. IEEE Trans Affect Comput 8:176-189
Zheng, Zhi; Warren, Zachary; Weitlauf, Amy et al. (2016) Brief Report: Evaluation of an Intelligent Learning Environment for Young Children with Autism Spectrum Disorder. J Autism Dev Disord 46:3615-3621
Zheng, Zhi; Young, Eric M; Swanson, Amy R et al. (2016) Robot-Mediated Imitation Skill Training for Children With Autism. IEEE Trans Neural Syst Rehabil Eng 24:682-91
Warren, Zachary; Zheng, Zhi; Das, Shuvajit et al. (2015) Brief Report: Development of a Robotic Intervention Platform for Young Children with ASD. J Autism Dev Disord 45:3870-6