Exoskeletons have great potential to restore mobility and improve quality of life for individuals with locomotor impairments due to neurotrauma such as spinal cord injury (SCI) and stroke. For several years, clinical exoskeleton use has accelerated gait retraining by enabling patients to take more steps per session, practice more repeatable gait patterns, and have their progress monitored more closely. Today, the technology has matured to the point that multiple FDA-approved systems are commercially available. Still, broader exoskeleton adoption, both in the clinic and for personal use, is needed for the technology to achieve its full potential to impact daily life. The key barrier to such adoption is the lack of a fluent and transparent interface for determining how the user intends to move in conjunction with the exoskeleton. This project seeks to enable exoskeleton use in the diverse scenarios of daily life by developing a robust approach to determining a user's intent. Because the fundamental mechanics of walking are universal, the approach is to leverage relatively simple template models of walking that encode these mechanics within algorithms that can identify user intent more reliably. Activities of interest include starting from rest; changing walking speed, direction, and cadence; stair ascent/descent; and stopping. The models will be applied to both healthy subjects and individuals with spinal cord injury to identify commonalities and critical differences. Beyond the benefits to exoskeleton end-users, the project will facilitate outreach to both K-12 teachers and middle school students to promote education in the STEM disciplines by highlighting how engineering can directly improve quality of life.

This project aims to improve human-machine interface technologies for lower-body exoskeletons by using simple template models of locomotion to more effectively capture user intent. In most existing approaches, mapping sensor data to the user's state/activity is treated as a black-box pattern recognition problem. In contrast, reduced-order models, or templates, are low-dimensional dynamical systems that capture the fundamental mechanics of walking, upon which more complex behaviors play out. The project will investigate physics-based template models to augment inference of exoskeleton user intent for small changes to the nominal walking gait (speed, direction, and cadence). Building on bio-inspired template control algorithms, extended Kalman filtering and non-parametric Bayesian approaches will be investigated to solve a stochastic intent-observer problem. Next, the work will be extended to detect transitions in user intent for starting, stopping, and gait progression to/from stairs. Throughout, parallel analysis will study customization of the methodology, including the use of entirely different template models, for exoskeleton users with locomotor impairments. Data-driven Floquet analysis will be applied to translate data from healthy users to augment intent recognition in users with spinal cord injury. The new intent-recognition methods will be incorporated into the control system of an FDA-approved exoskeleton, which will be used in experiments to assess the recognition delay and generalization performance of the methods relative to existing techniques. These experiments will be conducted first with healthy subjects and subsequently with individuals with incomplete spinal cord injury. Ultimately, the project has the potential to improve the impact of exoskeletons in areas spanning rehabilitation, search-and-rescue, military, and industrial applications by more effectively capturing user intent.
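To make the intent-observer idea concrete, the following is a minimal sketch (in Python/NumPy) of an extended Kalman filter built on the linear inverted pendulum (LIP), one of the simplest walking templates. It treats the user's intended walking speed as a slowly varying hidden state and uses an assumed Raibert-style foot-placement heuristic as the touchdown measurement that makes intent observable. Everything here (the LIP choice, the foot-placement rule, function names such as ekf_step, and all parameter and noise values) is an illustrative assumption, not the project's actual formulation.

```python
# Minimal sketch, not the award's implementation: an EKF over a linear
# inverted pendulum (LIP) template with the user's intended walking speed
# as a hidden, slowly varying state. All constants and the foot-placement
# heuristic are hypothetical.
import numpy as np

G, Z0, DT = 9.81, 1.0, 0.01          # gravity, assumed CoM height, sample time
OMEGA2 = G / Z0
T_STEP, K_FP = 0.6, 0.3              # assumed step time and foot-placement gain


def f(state, p_foot):
    """LIP process model; intended speed v_des evolves as a random walk."""
    x, v, v_des = state
    return np.array([
        x + DT * v,
        v + DT * OMEGA2 * (x - p_foot),
        v_des,
    ])


def F_jac(state, p_foot):
    """Jacobian of f (the LIP is linear, so this is constant)."""
    return np.array([
        [1.0, DT, 0.0],
        [DT * OMEGA2, 1.0, 0.0],
        [0.0, 0.0, 1.0],
    ])


def h_step(state):
    """Assumed Raibert-style foot placement: informative about v_des."""
    x, v, v_des = state
    return np.array([x + 0.5 * T_STEP * v + K_FP * (v - v_des)])


def H_step_jac(state):
    return np.array([[1.0, 0.5 * T_STEP + K_FP, -K_FP]])


def ekf_step(mu, P, p_foot, y_kin, y_foot=None):
    """One EKF cycle: predict through the LIP, update with kinematic
    measurements, and (at touchdown) with the observed foot placement."""
    Q = np.diag([1e-6, 1e-5, 1e-4])          # process noise: v_des drifts slowly
    # Predict.
    Fk = F_jac(mu, p_foot)
    mu, P = f(mu, p_foot), Fk @ P @ Fk.T + Q
    # Update with CoM position/velocity from exoskeleton sensors.
    H = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
    R = np.diag([1e-4, 1e-3])
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    mu = mu + K @ (y_kin - H @ mu)
    P = (np.eye(3) - K @ H) @ P
    # Event-based update at foot touchdown, if a new placement was observed.
    if y_foot is not None:
        Hs, Rs = H_step_jac(mu), np.array([[1e-3]])
        K = P @ Hs.T @ np.linalg.inv(Hs @ P @ Hs.T + Rs)
        mu = mu + (K @ (y_foot - h_step(mu))).ravel()
        P = (np.eye(3) - K @ Hs) @ P
    return mu, P


if __name__ == "__main__":
    mu, P = np.array([0.0, 1.0, 1.0]), np.eye(3) * 0.1
    # Synthetic sample: CoM at 0.02 m moving 1.2 m/s, touchdown observed at 0.45 m.
    mu, P = ekf_step(mu, P, p_foot=0.0,
                     y_kin=np.array([0.02, 1.2]), y_foot=np.array([0.45]))
    print("estimated intended speed:", mu[2])
```

In a formulation of this kind, the continuous kinematic updates mostly track the template state, while the event-based touchdown update is what couples the observations to the hidden intent; the non-parametric Bayesian and Floquet-based extensions described above would replace or augment these hand-chosen models.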

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 1734532
Program Officer: Sylvia Spengler
Budget Start: 2017-09-01
Budget End: 2021-08-31
Fiscal Year: 2017
Total Cost: $753,811
Name: University of Notre Dame
City: Notre Dame
State: IN
Country: United States
Zip Code: 46556