Effective robotic assistance of infants with or at risk of developing Cerebral Palsy (CP) has the potential to reduce the significant functional limitations as well as the potential deficits in cognitive development. This project focuses on the development and testing of a sequence of robotic assistants that promote early crawling, creeping, and walking, along with a model of infant-robot interaction that encourages the continued practice of movement patterns that will ultimately lead to unassisted locomotion. Typically developing infants initially learn to crawl through the generation of spontaneous limb and trunk movements. Early in the process, these spontaneous movements transport the infant across the floor. This rewarding locomotory experience drives the infant to refine the movements into intentional, exploratory skills. Ultimately, the infant intentionally engages these skills to solve larger problems, such as obtaining an interesting toy or exploring the environment. Infants with conditions such as CP lack the muscle strength, postural control, and motor coordination necessary for these early exploratory limb and trunk movements to result in locomotion. Without this positive feedback, the development of the neural pathways for productive limb use is diminished, resulting in delayed or absent crawling and walking. These limitations in mobility negatively affect other domains of development, such as perception and cognition, with effects visible even into adulthood.

The robotic assistants to be developed in this project will aid the infant in developing locomotory skills by selectively supporting a portion of his/her weight and by providing artificial, rewarding locomotory experiences. The PI's approach to infant-robot interaction is to first instrument the infant with a set of sensors that allow the trunk and limb positions to be reconstructed in real time. A semi-supervised clustering process will then identify a menu of canonical spatio-temporal limb and trunk movement patterns from observations of behavior exhibited by children who are either typically developing or at risk of developing CP. The robot will respond to the recognition of a canonical movement by providing the corresponding postural support and transport of the child. The PI's hypothesis is that this positive feedback will encourage the continued practice of the canonical movements, as well as their use in solving larger problems. The infant-robot interaction model will selectively reward specific canonical movements as different levels of capability are exhibited. As the child becomes proficient at using a simple movement to trigger robotic assistance, the robot will reduce (and ultimately eliminate) its response to that particular canonical movement; other canonical movements, encoding related but more complex and/or coordinated limb movements, will continue to be available. As the limb movements are mastered, the vertical support will be reduced to encourage the infant to bear more of his/her own weight. The hypothesis is that this early intervention approach will help to guide the child along a progressive developmental trajectory that ends with locomotory skills and muscle strength requiring little or no assistance. EEG-based neuroimaging will be used to monitor the progression of the infant's development; the hypothesis is that the degree of proficiency in specific skills will be identifiable from an EEG index related to motor output. This information will be used to guide the semi-supervised clustering process, as well as the decision process for selectively rewarding certain canonical movements.
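
The adaptive reward schedule described above (reward a canonical movement until it is mastered, then fade that reward while more complex movements remain available, and separately fade the vertical weight support) can be summarized compactly. The Python sketch below is purely illustrative: the movement labels, mastery threshold, linear fading rules, and proficiency updates are hypothetical placeholders for quantities that the project would derive from the semi-supervised clustering step, clinician input, and the EEG-based motor index.

```python
# Minimal sketch of the adaptive assistance policy described above.
# All names, thresholds, and fading rules are illustrative assumptions,
# not the project's actual control software.

from dataclasses import dataclass, field

@dataclass
class CanonicalMovement:
    name: str                 # label produced by the clustering step
    complexity: int           # ordering from simple to coordinated patterns
    proficiency: float = 0.0  # 0..1, e.g. informed by an EEG motor index
    successes: int = 0

@dataclass
class AssistancePolicy:
    movements: dict = field(default_factory=dict)
    mastery_threshold: float = 0.8   # hypothetical cutoff for "proficient"
    vertical_support: float = 1.0    # fraction of body weight carried by robot

    def add_movement(self, movement: CanonicalMovement) -> None:
        self.movements[movement.name] = movement

    def assistance_for(self, name: str) -> float:
        """Return how strongly the robot rewards a recognized movement.

        The reward fades linearly as proficiency approaches the mastery
        threshold and disappears once the movement is mastered, so that
        only the more complex canonical movements keep triggering help.
        """
        movement = self.movements[name]
        if movement.proficiency >= self.mastery_threshold:
            return 0.0
        return 1.0 - movement.proficiency / self.mastery_threshold

    def record_success(self, name: str, proficiency_update: float) -> None:
        """Update proficiency after an assisted movement, then reduce
        vertical support as more of the repertoire is mastered."""
        movement = self.movements[name]
        movement.successes += 1
        movement.proficiency = min(1.0, movement.proficiency + proficiency_update)
        mastered = sum(m.proficiency >= self.mastery_threshold
                       for m in self.movements.values())
        self.vertical_support = max(0.0, 1.0 - mastered / len(self.movements))

# Example with two hypothetical canonical movements.
policy = AssistancePolicy()
policy.add_movement(CanonicalMovement("arm_reach", complexity=1))
policy.add_movement(CanonicalMovement("contralateral_kick", complexity=2))
policy.record_success("arm_reach", proficiency_update=0.3)
print(policy.assistance_for("arm_reach"))   # reward already fading: 0.625
```

In the actual system, the proficiency estimate and the fading schedule would be driven by the observed movement quality and the EEG index rather than by fixed constants.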

Broader Impacts: Equipping children with CP at an early age with locomotory skills will not only bring them more in line with typically developing children, but will also reduce their reliance on long-term care while increasing their success in self-help, in education, and in the workplace. The techniques will be applicable to a range of other childhood disorders (including Down Syndrome), to retraining patients following stroke, and to the creation of tunable gestural interfaces for intelligent prostheses.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 1208639
Program Officer: Ephraim Glinert
Budget Start: 2012-10-01
Budget End: 2017-09-30
Fiscal Year: 2012
Total Cost: $1,175,000
Name: University of Oklahoma
City: Norman
State: OK
Country: United States
Zip Code: 73019