Autism Spectrum Disorder (ASD) is the most common neurodevelopmental disorder among adolescents in the United States, with an estimated prevalence of one in 59. Its major characteristics include deficits in social, sensory, and emotional processing, and boys are diagnosed at roughly four times the rate of girls. General interventions such as behavioral and speech/language therapies have been developed, and socially assistive robots have shown positive outcomes in communicating with individuals with ASD and delivering interactive interventions. However, a major obstacle to broadening the impact of robotic assistance and intervention in real-world settings is the lack of a robotic framework that can adaptively learn diverse interaction skills over time and associate them with new social contexts as the human counterpart develops. Furthermore, recent studies show that girls are under-diagnosed, and thus under-served, because their greater sensitivity to social cues produces a social camouflage that often leaves their symptoms undetected. There is therefore a demonstrated need for intelligent, socially assistive robotic systems that can accommodate the developmental processes and gender-specific characteristics of children with ASD. To address this need, the project will develop a novel socio-emotional human-robot interaction framework that provides interactive emotional regulation and guidance through robot-initiated conversation and gestures in response to an individual's unique socio-emotional cues, recognized through analysis of voice signals, facial expressions, gestures, and conversation. The framework, which will be implemented on virtual and physical robotic platforms, will promote healthier and more emotionally balanced living for individuals with socio-emotional processing disorders or deficits. The knowledge gained through this project will be infused into educational activities that help future generations of students, including students from underrepresented groups, develop creative mindsets and analytic skills in science, technology, engineering, art, and math (STEAM). Collaborative outreach activities include applying the developed technology at the George Washington Autism & Neurodevelopmental Disorders Institute (ANDI) facility; working with the Kennedy Krieger Institute through robotic design workshops, summer robotics camps, and robotic interactions in school environments; and participating in the Take2 Summer Camp program, a four-week therapeutic camp for children who have difficulty functioning in the social world.
The PI's long-term career research goal is to understand the fundamental principles of human interactions and behaviors and to translate these mechanisms into computational models and algorithms for a novel assistive robotic framework. Toward this goal, the project will develop a socially assistive robotic framework with contextual ambidexterity that is perceptive of personal socio-emotional states, capable of learning social skills, emotionally interactive, and gender-smart for long-term human-robot interaction and intervention (LT-HRI). Contextual ambidexterity concerns the key metrics for synchronizing two strategies: how best to exploit given functionality and resources in performing a task, and how to efficiently explore new skills and knowledge to gain social intelligence over time. It applies directly to a robotic agent that must select the best of its programmed skills (exploitation) when faced with well-perceived situations, yet must also learn new skills and contexts (exploration) when faced with unknown situations. To evaluate the efficacy of the framework in the real world, an active-learning robotic agent will be developed to assist in socio-emotional LT-HRI for adolescents with autism spectrum disorder (ASD). The Research Plan is organized under four tasks. The FIRST TASK is to achieve advanced emotional perception tailored to an individual's unique socio-emotional cues. A voice-based emotion-estimation module will be developed and combined with a widely used facial-expression analysis module to drive an interactive communication module that can query the user to acquire more accurate emotional characteristics when needed. The SECOND TASK is to learn social gestures and contexts from personal interactions and communications. Algorithms have been developed to capture the real-time sequence of a human gesture and to generate "gesture-features" that are passed to a module that checks whether each feature has already been learned and, if not, registers it as a new gesture. Once a gesture is learned, the robot will practice the learned behavior and observe the user's response to learn and update the social context of the gesture. The THIRD TASK is to develop an efficient framework for modeling emotional interaction and regulation between a human user and a robotic agent, along with controllable algorithms for effective rapport formation. The model involves three emotional agents: the human's emotion, the robot's emotion, and a target emotional goal for emotional regulation and therapy. In the "rapport forming" phase, the robot's emotional goal approaches the human's emotional goal to establish a common bond of empathy. If rapport formation is failing, the robot enters the "emotional guidance" phase and takes a proactive role in moving the human's emotional state toward the target emotion. The FOURTH TASK is to develop a gender-smart robotic intervention system, i.e., an optimal interaction policy for gender-specific social environments. Once the robot has perceived the user's emotional state, it can select subsequent actions to increase the user's engagement and participation. Gender-smart behavior planning will be based on Partially Observable Markov Decision Process (POMDP) models of each user's interaction patterns and preferences, which can be used to maximize the total expected reward and determine the robot's next action. Results from the POMDP models can be incorporated into gender-specific representations (e.g., heatmaps) that can be used to anticipate gender differences.
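To make the FIRST TASK concrete, the sketch below shows one possible way the two perception modules could be fused before handing control to the interactive communication module. It is a minimal illustration in Python; the emotion label set, the confidence-weighted averaging rule, and the ambiguity threshold are assumptions for illustration only, not the project's actual design.

```python
import numpy as np

# Illustrative emotion label set; the project's actual categories may differ.
EMOTIONS = ["happy", "sad", "angry", "neutral"]

def fuse_emotions(voice_probs, face_probs, voice_conf, face_conf):
    """Confidence-weighted fusion of voice- and face-based emotion estimates.

    voice_probs, face_probs: per-emotion probability vectors over EMOTIONS.
    voice_conf, face_conf: scalar confidences in [0, 1] reported by each module.
    Returns the fused distribution and a flag telling the interactive
    communication module to query the user when the estimate is ambiguous.
    """
    fused = np.asarray(voice_probs) * voice_conf + np.asarray(face_probs) * face_conf
    fused = fused / fused.sum()
    top_two = np.sort(fused)[-2:]
    needs_clarification = (top_two[1] - top_two[0]) < 0.1  # nearly tied top emotions
    return dict(zip(EMOTIONS, fused)), needs_clarification

# Example: the face module is uncertain, so the voice estimate dominates.
dist, ask_user = fuse_emotions([0.1, 0.7, 0.1, 0.1], [0.3, 0.3, 0.2, 0.2],
                               voice_conf=0.9, face_conf=0.4)
```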
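The SECOND TASK's learn-or-recognize step can be sketched as follows, under the assumption that a "gesture-feature" reduces to a fixed-length vector and that matching uses a simple nearest-neighbor distance test; the threshold and data structure are illustrative placeholders, not the developed algorithms.

```python
import numpy as np

class GestureMemory:
    """Stores learned gesture-features and registers new ones when no match is found."""

    def __init__(self, match_threshold=0.5):
        self.features = []            # learned gesture-feature vectors
        self.contexts = []            # social context learned for each gesture
        self.match_threshold = match_threshold

    def observe(self, feature):
        """Return (index, is_new): recognize a known gesture or register a new one."""
        feature = np.asarray(feature)
        if self.features:
            dists = [np.linalg.norm(feature - f) for f in self.features]
            best = int(np.argmin(dists))
            if dists[best] < self.match_threshold:
                return best, False    # already-learned gesture
        self.features.append(feature)
        self.contexts.append(None)    # context filled in after practicing the gesture
        return len(self.features) - 1, True

    def update_context(self, idx, user_response):
        """After the robot practices the gesture, store the observed social context."""
        self.contexts[idx] = user_response
```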
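The two phases of the THIRD TASK can be pictured as simple dynamics in a continuous emotion space (for example, valence and arousal). The sketch below only illustrates the phase-switching idea; the state representation, gains, and patience rule are assumptions rather than the project's model.

```python
import numpy as np

def regulation_step(robot_e, human_e, target_e, steps_without_rapport,
                    gain=0.3, rapport_threshold=0.5, patience=5):
    """One step of an illustrative two-phase regulation loop.

    robot_e, human_e, target_e: points in a 2-D valence-arousal space.
    Rapport forming: the robot's emotional state approaches the human's to
    build empathy. If rapport has not formed after `patience` steps, the
    robot switches to emotional guidance and instead expresses states that
    pull the interaction toward the target emotion.
    """
    robot_e, human_e, target_e = map(np.asarray, (robot_e, human_e, target_e))
    rapport_formed = np.linalg.norm(robot_e - human_e) < rapport_threshold
    if rapport_formed or steps_without_rapport < patience:
        goal = human_e    # rapport forming: approach the human's emotional state
    else:
        goal = target_e   # emotional guidance: lead toward the target emotion
    return robot_e + gain * (goal - robot_e)
```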
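For the FOURTH TASK, the sketch below illustrates the general POMDP mechanics the description refers to: the robot maintains a belief over a hidden engagement state, updates it from observations, and chooses the action with the highest expected reward. The states, actions, model matrices, and greedy one-step lookahead are illustrative assumptions; a full POMDP policy would optimize over a longer horizon and would be fit to each user.

```python
import numpy as np

# Hypothetical hidden states (engagement level) and robot actions.
STATES  = ["disengaged", "engaged"]
ACTIONS = ["verbal_prompt", "gesture", "wait"]

# Illustrative model parameters; in the project these would be estimated per
# user and summarized into gender-specific representations such as heatmaps.
T = {a: np.array([[0.7, 0.3], [0.2, 0.8]]) for a in ACTIONS}   # T[a][s, s']
O = np.array([[0.8, 0.2], [0.3, 0.7]])                          # O[s', obs] = P(obs | s')
R = {"verbal_prompt": np.array([0.5, 1.0]),                     # R[a][s] = expected reward
     "gesture":       np.array([0.8, 0.6]),
     "wait":          np.array([0.0, 0.2])}

def update_belief(belief, action, observation):
    """Bayes filter: predict with T[action], then weight by the observation likelihood."""
    predicted = belief @ T[action]
    posterior = predicted * O[:, observation]
    return posterior / posterior.sum()

def best_action(belief):
    """Greedy one-step policy: pick the action with the highest expected immediate reward."""
    return max(ACTIONS, key=lambda a: belief @ R[a])

belief = np.array([0.5, 0.5])                  # initial uncertainty about engagement
action = best_action(belief)                   # e.g., "verbal_prompt"
belief = update_belief(belief, action, observation=1)
```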
A set of experiments with human participants (neurotypical adolescents and adolescents with ASD, ages 10-19) will be designed to validate the algorithms and assess the overall effectiveness of the robotic framework. The specifications, training data, and algorithm outcomes produced by the project will be openly disseminated to researchers as open-source Robot Operating System (ROS) packages and repositories.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.