This research involves collaboration among investigators at four institutions. Recent advances in the study of motor behavior have uncovered structure in the underlying neural control architecture, including distinctions between feed-forward control, feedback control, and learning. While the neural code has not yet been cracked, much is now known about how its foundations for sensorimotor control differ from those of even the most modern computer-based algorithms. For example, neural function must accommodate transmission and processing delays, so feedback control is subservient to feed-forward and anticipatory control. The nervous system produces exquisite, constantly and widely available predictions about interactions between the body and the environment. These predictive models (also called internal models) are constructed by learning the invariants in the mapping from motor commands to sensory feedback (and inverses thereof).

The PIs have developed a unique approach in which readings from a scalp array of EEG electrodes are used to construct algorithms (decoders) that predict motor behavior (control signals) as a weighted sum of the EEG data from all electrodes at multiple time lags. The team has demonstrated two-axis control of a screen cursor using only 10 minutes of EEG and motion training data, a feat far surpassing any brain-computer interface (BCI) available to date.

In the current project, the team will build upon this prior work to design and validate noninvasive neural decoders that enable agile control of upper-limb prosthetics. To this end, they will investigate neural correlates of brain adaptation to multiple sources of feedback using EEG and functional near-infrared spectroscopy (fNIR). An important challenge will be to provide sensory feedback appropriate to contact tasks performed with a prosthesis. Existing BCIs and neuro-prosthetic devices rely at best on vibrotactile feedback and often only on visual feedback.
The PIs will add haptic and proprioceptive feedback, in concert with a novel adaptation of vibrotactile, skin-stretch, and arm-squeeze technologies in the prosthesis interface, to provide intuitive control over contact tasks and to strengthen the motor imagery whose neural correlates are processed by the EEG decoder. To establish baseline measures, the team will compare prosthetic performance under direct brain control with myoelectric prosthetic control and direct manual control. Experiments involving both able-bodied individuals and amputees will combine real-time EEG decoding with EEG/fNIR analysis of sensorimotor control and cognitive load.
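The decoding approach described above, predicting control signals as a weighted sum of EEG data from all electrodes at multiple time lags, amounts to a linear model over a time-lagged design matrix. The sketch below illustrates that idea; the array shapes, lag count, and ridge regularization are illustrative assumptions, not the PIs' actual implementation.

```python
import numpy as np

def build_lagged_features(eeg, n_lags):
    """Stack EEG samples from all channels at multiple time lags.

    eeg: (n_samples, n_channels) array of preprocessed EEG amplitudes.
    Returns an (n_samples - n_lags, n_channels * n_lags) design matrix,
    where each row concatenates the current and previous n_lags - 1 samples.
    """
    n_samples, n_channels = eeg.shape
    rows = []
    for t in range(n_lags, n_samples):
        rows.append(eeg[t - n_lags + 1 : t + 1].ravel())
    return np.asarray(rows)

def fit_linear_decoder(eeg, kinematics, n_lags=10, ridge=1.0):
    """Fit decoder weights W so that lagged_EEG @ W approximates kinematics.

    kinematics: (n_samples, n_outputs) recorded control signals, e.g. 2-D
    cursor position or velocity. Ridge regularization (an assumption here)
    keeps the solution stable when nearby electrodes are correlated.
    """
    X = build_lagged_features(eeg, n_lags)
    Y = kinematics[n_lags:]  # align targets with the lagged features
    XtX = X.T @ X + ridge * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ Y)

def decode(eeg, W, n_lags=10):
    """Predict control signals (e.g. two-axis cursor motion) from new EEG."""
    return build_lagged_features(eeg, n_lags) @ W
```

With a short calibration recording of simultaneous EEG and motion data, `fit_linear_decoder` learns one weight per (electrode, lag) pair per output axis, and `decode` then maps streaming EEG to control signals in real time.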

Broader Impacts: This research will revolutionize the control and interface of upper limb prosthetics. The work will lead to a better understanding of the role of sensory feedback in brain-computer interfaces and will lay the foundation for restoration of motor and sensory function for amputees and individuals with neurological disease. The project will create a unique interdisciplinary environment enabling education, training, co-advising and exchange of graduate students, course development, and involvement of undergraduates in research. The PIs will also participate in outreach activities on their various campuses, targeting underrepresented groups in science and engineering.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Application #: 1219321
Program Officer: Ephraim Glinert
Budget Start: 2011-11-01
Budget End: 2016-05-31
Fiscal Year: 2012
Total Cost: $371,370
Name: University of Houston
City: Houston
State: TX
Country: United States
Zip Code: 77204