This application will result in a technological platform that re-empowers persons with severe paralysis by allowing them to independently control a wide spectrum of robotic actions. Severe paralysis is devastating and chronic, and reliance on caregivers is persistent. Assistive machines such as wheelchairs and robotic arms offer a groundbreaking path to independence, returning control over the environment and over physical interactions to the person. However, operating complex machines like robotic arms and hands typically poses a difficult learning challenge and requires complex control signals, and the commercial control interfaces accessible to persons with severe paralysis (e.g., sip-and-puff systems, switch-based head arrays) are not adequate. As a result, assistive robotic arms remain largely inaccessible to those with severe paralysis, arguably the population who would benefit from them most. The purpose of the proposed study is to provide people with tetraplegia with the means to control robotic arms with their available body mobility, while concurrently promoting the exercise of available body motions and the maintenance of physical health.

Control interfaces that generate a unique map from a user's body motions to control signals for a machine offer a customized interaction; however, these interfaces have so far issued only low-dimensional (2-D) control signals, whereas more complex machines require higher-dimensional (e.g., 6-D) signals. We propose an approach that leverages robotics autonomy and machine learning to aid the end-user in learning how to issue effective higher-dimensional control signals through body motions. Specifically, the human initially issues a lower-dimensional control signal, and robotics autonomy bridges the gap by taking over whichever control dimensions the human's signal does not cover. Help from the robotics autonomy is then progressively and automatically scaled back, covering fewer and fewer control dimensions as the user becomes more skilled.

The first piece of our approach deals with how to extract control signals from the human using the body-machine interface; the development and optimization of decoding procedures for controlling a robotic arm using residual body motions will be addressed under Specific Aim 1. The second piece deals with how to interpret control signals from the human within a paradigm that shares control between the human and robotics autonomy; identifying which shared-control formulations most effectively utilize the human's control signals will be the topic of Specific Aim 2. The final piece deals with how to adapt the shared-control paradigm so that more control is transferred to the human over time. This adaptation is necessary for the human's learning process, since the end goal is for the human to fully control the robotic arm on their own, and it will be assessed under Specific Aim 3.

At the completion of this project, tetraplegic end-users will be able to operate a robotic arm using their residual body motions, through an interface that both promotes the use of residual body motions (and thus also recovery of motor skill) and adapts with the user as their abilities change over time. By leveraging adaptive robotics autonomy, our application moreover provides a safe mechanism to facilitate learning how to operate the robotic platform.
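The control architecture described above can be summarized in a small sketch. The Python code below is purely illustrative and is not the project's actual method: the class names (BodyMachineInterface, SharedController), the PCA-style decoder, the dimension-wise split between human and autonomy, and the threshold-based adaptation rule are all hypothetical placeholders for the procedures to be developed under Specific Aims 1-3.

```python
# Minimal, hypothetical sketch of the proposed pipeline (illustrative only).
import numpy as np


class BodyMachineInterface:
    """Maps high-dimensional residual body motions (e.g., sensor signals from the
    shoulders/head) to a low-dimensional control signal via a linear projection,
    in the spirit of PCA-based body-machine interfaces (cf. Specific Aim 1)."""

    def __init__(self, n_control_dims=2):
        self.n_control_dims = n_control_dims
        self.projection = None  # learned from calibration data

    def calibrate(self, body_motion_samples):
        # body_motion_samples: (n_samples, n_sensors) recorded during free exploration.
        centered = body_motion_samples - body_motion_samples.mean(axis=0)
        # Principal directions of the user's own motion become the control axes.
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        self.projection = vt[: self.n_control_dims]

    def decode(self, body_motion):
        # Project a single sensor reading (n_sensors,) onto the control axes.
        return self.projection @ body_motion


class SharedController:
    """Combines the human's low-dimensional command with autonomy covering the
    remaining dimensions of a full 6-D command (cf. Specific Aim 2), and hands
    dimensions back to the human as skill improves (cf. Specific Aim 3)."""

    def __init__(self, full_dims=6, human_dims=2):
        self.full_dims = full_dims
        self.human_dims = human_dims  # dimensions currently under human control

    def command(self, human_signal, autonomy_signal):
        # human_signal: (human_dims,) from the body-machine interface.
        # autonomy_signal: (full_dims,) e.g., from a goal-directed planner.
        u = np.array(autonomy_signal, dtype=float)
        u[: self.human_dims] = human_signal[: self.human_dims]
        return u

    def adapt(self, skill_score, threshold=0.8):
        # Hypothetical adaptation rule: transfer one more dimension to the human
        # whenever a task-performance score exceeds a threshold.
        if skill_score > threshold and self.human_dims < self.full_dims:
            self.human_dims += 1
```

In this sketch, the autonomy_signal would come from a goal-inference or planning module and the skill_score from task-performance metrics; both are outside the scope of the example, and the actual decoding, shared-control, and adaptation formulations are the subjects of the three Specific Aims.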

Public Health Relevance

Frequently, the more severe a person's motor impairment, the less able they are to operate the very assistive machines, such as powered wheelchairs and robotic arms, that might enhance their quality of life. The purpose of the proposed study is to provide people with tetraplegia with the means, through non-invasive technologies, to control robotic arms with their available body mobility, while concurrently promoting the exercise of available body motions and the maintenance of physical health. We propose and study an approach that leverages robotics autonomy and machine learning to facilitate human motor learning of how to operate a robotic arm through a customized interface, in order to make powered assistive manipulation more accessible to people with severe paralysis.

Agency: National Institutes of Health (NIH)
Institute: National Institute of Biomedical Imaging and Bioengineering (NIBIB)
Type: Research Project (R01)
Project #: 5R01EB024058-02
Application #: 9773039
Study Section: Bioengineering of Neuroscience, Vision and Low Vision Technologies Study Section (BNVT)
Program Officer: Peng, Grace
Project Start: 2018-09-01
Project End: 2022-05-31
Budget Start: 2019-06-01
Budget End: 2020-05-31
Support Year: 2
Fiscal Year: 2019
Total Cost:
Indirect Cost:
Name: Rehabilitation Institute of Chicago D/B/A Shirley Ryan AbilityLab
Department:
Type:
DUNS #: 068477546
City: Chicago
State: IL
Country: United States
Zip Code: 60611