This project develops SocioBot-SDS (SocioBot Spoken Dialog System), an instrument in the form of a robotic character with emotional responses, and is expected to advance research on next-generation human-machine interaction. The robotic instrument will be used for therapeutic and educational purposes, and its development will specifically contribute to research on the perception of speech and visual behaviors. The instrument integrates a unique level of programmability and robustness in the display of human- and non-human-like emotive gestures with conventional orientation control through an articulated neck. The work is expected to accelerate research and development of social robots that can accurately model the dynamics of face-to-face communication with a sensitive and effective human tutor, clinician, or caregiver to a degree unachievable with current instrumentation. The robotic agent builds on advances in computer vision, spoken dialogue systems, character animation, and affective computing to conduct dialogues that establish rapport with users, producing rich, emotive facial gestures synchronized with prosodic speech generation in response to users' speech and emotions. The instrument represents a new level of integration of emotive capabilities, enabling researchers to study socially emotive robots and agents that can understand spoken language, show emotions, and interact, speak, and communicate effectively with people in a natural way, as humans do.

The instrument provides an exciting platform for research and training supporting cross-disciplinary technology. Because the research community will have access to the instrument, research on how to optimize communication between people and avatars may be accelerated through the study of the speech patterns and visual behaviors of those interacting. The instrument also serves as a platform for designing a new generation of more immersive and effective intelligent tutoring and therapy systems, as well as robot-assisted therapeutic treatments for populations that include infants at risk for sensory, attention, and language delays and adults with mental disabilities. The resulting instrument will be freely distributed for researchers to investigate robotic behaviors that lead to immersive and effective applications across a variety of task domains, such as teaching students to read, tutoring students in science, conducting speech and language therapy sessions, or providing companionship to elderly individuals in their homes. From an educational perspective, the activities initiated by this development will enhance interdisciplinary education by involving students at the undergraduate and graduate levels, during and beyond the development of the instrument.

Agency: National Science Foundation (NSF)
Institute: Division of Computer and Network Systems (CNS)
Type: Standard Grant (Standard)
Application #: 1427872
Program Officer: Rita Rodriguez
Budget Start: 2014-09-01
Budget End: 2019-08-31
Fiscal Year: 2014
Total Cost: $1,348,000
Name: University of Denver
City: Denver
State: CO
Country: United States
Zip Code: 80210