Project Proposed: This project, acquiring equipment to create an instrument for social robot learning from human infant studies, aims to transform the development of human-interactive robots by applying fundamental scientific paradigms from developmental psychology studies of infants to new ways of designing autonomous robots. Specifically, the research uses data from human infant perceptual-motor learning studies of reaching and grasping to build comprehensive models of human sensorimotor control that achieve embodied learning like that exhibited by infants. These models will then be instantiated in a robotic setting to create a machine that can autonomously acquire shared object manipulation skills through bottom-up and top-down processes mimicking human infant learning.

The equipment will facilitate multiple research thrusts within the grand scope of understanding and improving human-robot interaction:
- Study of biologically plausible visual attention mechanisms,
- Adaptive visually guided motor skills, and
- Inference of human state.

Infant visual search patterns, from which models of perception will be developed, will be acquired via the eye-tracking system. The robotic arm system will serve as a platform for extensive studies of perceptual-motor skills. The instrumentation should enable the following outcomes:
- Robotic systems able to learn to physically interact with humans through shared object manipulation, acquiring skills that allow the handling of previously unseen objects.
- Improved human-robot interaction, especially regarding manipulation, grasping, and handling capabilities, as well as eye tracking, permitting better study of the role of vision in infant grasp and reach learning.
- Significant integration of two bodies of research, psychology and engineering, coupled through the computational models built. These models will provide a two-way exchange of ideas between the two fields, leading both to the design of new experiments in psychology and to new mechanisms for interactive robots.

Broader Impacts: The instrument, initially used by 7 faculty in 3 departments across 2 colleges, will offer new opportunities for cross-disciplinary training in cognitive psychology, developmental cognitive neuroscience, computer science, computer engineering, electrical engineering, and mechanical engineering, for both graduate and undergraduate students. The enabled research is also expected to have significant practical societal impact, as many potential applications of human-robot interaction involve the exchange of objects (e.g., an assistive robot picking up a dropped TV remote control for a disabled person, a delivery robot handing a package to a human, or a therapeutic robot handing a toy to an autistic child). The work may have implications for addressing developmental problems in children, stemming from the increased understanding of perceptual and motor learning processes in infants. Moreover, the instrumentation provides experiential and cross-disciplinary opportunities that buttress classroom theory, providing a positive impact on education, students, faculty, K-12 teachers, museums, etc.

Agency
National Science Foundation (NSF)
Institute
Division of Computer and Network Systems (CNS)
Type
Standard Grant (Standard)
Application #
1229176
Program Officer
Rita Rodriguez
Project Start
Project End
Budget Start
2012-09-01
Budget End
2017-08-31
Support Year
Fiscal Year
2012
Total Cost
$312,844
Indirect Cost
Name
University of Tennessee Knoxville
Department
Type
DUNS #
City
Knoxville
State
TN
Country
United States
Zip Code
37916