The purpose of this application is to explore how high-level planning signals can be used to control neural prosthetics that assist paralyzed patients. The study has both scientific and engineering components. The scientific investigations explore how cognitive signals related to movement intentions are encoded in parietal-frontal circuits. The engineering component will be guided by the scientific findings to best design and tailor algorithms for decoding these cognitive signals. Areas of algorithmic development include new signal processing and feature extraction techniques, extensions of Bayesian classification and Kalman filtering algorithms, and new applications of speech recognition and finite-state machine techniques.
Aim 1 will examine how the goals of reach movements are represented in three dimensions in the parietal reach region (PRR) and the dorsal premotor cortex (PMd), and will develop decoding algorithms that use these signals to control the location of a cursor (the so-called brain-control task). Goal decoding has the attributes of being versatile and rapid for prosthetic applications.
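The specific goal-decoding algorithms are part of the proposed research; as a minimal illustrative sketch of the general Bayesian-classification approach (all function names and parameters here are hypothetical, and real decoders would use richer spike-count models), a Gaussian maximum-likelihood classifier over firing rates might look like:

```python
import numpy as np

def fit_gaussian_decoder(rates, goals):
    """Fit a per-goal Gaussian model (mean, variance) of each unit's firing rate.

    rates: (n_trials, n_units) firing rates; goals: (n_trials,) goal labels.
    """
    models = {}
    for g in np.unique(goals):
        r = rates[goals == g]
        models[g] = (r.mean(axis=0), r.var(axis=0) + 1e-6)  # variance floor
    return models

def decode_goal(models, rate):
    """Return the maximum-likelihood goal for one trial's firing-rate vector."""
    best, best_ll = None, -np.inf
    for g, (mu, var) in models.items():
        # Log-likelihood under independent Gaussians (naive Bayes)
        ll = -0.5 * np.sum((rate - mu) ** 2 / var + np.log(2 * np.pi * var))
        if ll > best_ll:
            best, best_ll = g, ll
    return best
```

Because a discrete goal is decoded in a single step rather than integrated over a trajectory, this style of classifier is what makes goal decoding fast for prosthetic cursor control.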
This aim will also determine whether goal locations can be decoded from local field potentials (LFPs) rather than spike activity, using advanced signal processing techniques. An advantage of LFPs for prosthetics is their ease and longevity of recording.
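A common first step in LFP decoding is to extract spectral features such as power in a frequency band; as a simple illustration (a hypothetical sketch using a plain periodogram, where a real pipeline would likely use multitaper or Welch estimates):

```python
import numpy as np

def band_power(lfp, fs, band):
    """Average periodogram power of an LFP trace within a frequency band.

    lfp: 1-D voltage trace; fs: sampling rate in Hz; band: (low, high) in Hz.
    """
    freqs = np.fft.rfftfreq(lfp.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(lfp)) ** 2 / (fs * lfp.size)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()
```

Band-power features computed this way per channel could then be fed to a classifier of the kind used for spike-based goal decoding.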
Aim 2 will examine whether neural activity in PRR and PMd predicts the current location of the limb during trajectory movements, and whether this "forward model" can be used to generate trajectories in brain-control tasks. Techniques suited to continuously varying dynamic systems will be applied to decoding the trajectories.
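The Kalman filter mentioned above is the standard technique for this kind of continuously varying system: a linear dynamics model of the hand state is combined with a linear observation model of neural activity. A minimal sketch (the matrices and function name are hypothetical placeholders, not the proposal's actual models):

```python
import numpy as np

def kalman_decode(Y, A, C, W, Q, x0, P0):
    """Run a Kalman filter over a sequence of neural observations.

    Model:  x_t = A x_{t-1} + w,  w ~ N(0, W)   (hand-state dynamics)
            y_t = C x_t + q,      q ~ N(0, Q)   (neural tuning / observation)
    Y: (T, n_units) observations. Returns the (T, n_state) posterior means.
    """
    x, P = x0.copy(), P0.copy()
    out = []
    for y in Y:
        # Predict step: propagate state and uncertainty through the dynamics
        x = A @ x
        P = A @ P @ A.T + W
        # Update step: correct the prediction with the neural observation
        S = C @ P @ C.T + Q
        K = P @ C.T @ np.linalg.inv(S)
        x = x + K @ (y - C @ x)
        P = (np.eye(P.shape[0]) - K @ C) @ P
        out.append(x.copy())
    return np.array(out)
```

In a trajectory decoder the state x would typically hold hand position and velocity, and A, C, W, Q would be fit from training data by regression.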
Aim 3 will study plasticity in PRR and PMd related to context, learning, and reward. This aim will examine how the brain's ability to learn and adapt can improve the performance of brain-machine interfaces.
Aim 4 will examine the very challenging problem of decoding movement plans continuously. Studies in this field generally use event markers derived from the trials of a task to assist decoding. These markers will not exist in clinical applications of prosthetics, however, and the problem of recognizing and interpreting neural signals becomes much more challenging. We will apply and extend techniques from speech recognition and finite-state machines to this problem. In particular, we will examine how eye movement information during natural hand-eye coordination can help decode reach movements from neural activity. This eye movement information will be derived from eye movement recordings and from recordings of neural signals related to eye movements. Knowledge from this work will be applied to brain-control tasks involving the continuous, sequential determination of goals.
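The speech-recognition analogy is that, without trial markers, the decoder must infer hidden behavioral states (for instance "idle," "planning," "reaching") from a continuous observation stream, much as a hidden Markov model segments speech into phonemes. As an illustrative sketch (the three states, the discretized observations, and all probabilities below are hypothetical), a Viterbi decoder over such a state machine could look like:

```python
import numpy as np

def viterbi(obs, states, log_trans, log_emit, log_init):
    """Most-likely hidden state sequence for a sequence of discrete observations.

    obs: list of observation symbol indices; states: list of state names;
    log_trans[i, j]: log P(state j at t | state i at t-1);
    log_emit[j, k]: log P(symbol k | state j); log_init[j]: log P(state j at t=0).
    """
    T, n = len(obs), len(states)
    dp = np.full((T, n), -np.inf)          # best log-probability ending in each state
    back = np.zeros((T, n), dtype=int)     # backpointers for path recovery
    dp[0] = log_init + log_emit[:, obs[0]]
    for t in range(1, T):
        for j in range(n):
            scores = dp[t - 1] + log_trans[:, j]
            back[t, j] = np.argmax(scores)
            dp[t, j] = scores[back[t, j]] + log_emit[j, obs[t]]
    path = [int(np.argmax(dp[-1]))]
    for t in range(T - 1, 0, -1):          # trace backpointers to recover the path
        path.append(back[t, path[-1]])
    return [states[i] for i in reversed(path)]
```

In this framing, the transition structure plays the role of the finite state machine (e.g. planning tends to precede reaching), and eye movement information could enter as an additional observation stream that sharpens the emission probabilities.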

Agency
National Institutes of Health (NIH)
Institute
National Eye Institute (NEI)
Type
Research Project (R01)
Project #
5R01EY013337-09
Application #
7668526
Study Section
Special Emphasis Panel (ZRG1-BDCN-K (10))
Program Officer
Steinmetz, Michael A
Project Start
2001-02-01
Project End
2011-08-31
Budget Start
2009-09-01
Budget End
2010-08-31
Support Year
9
Fiscal Year
2009
Total Cost
$397,063
Indirect Cost
Name
California Institute of Technology
Department
Type
Schools of Arts and Sciences
DUNS #
009584210
City
Pasadena
State
CA
Country
United States
Zip Code
91125
Graf, Arnulf B A; Andersen, Richard A (2015) Predicting oculomotor behaviour from correlated populations of posterior parietal neurons. Nat Commun 6:6024
Klaes, Christian; Kellis, Spencer; Aflalo, Tyson et al. (2015) Hand Shape Representations in the Human Posterior Parietal Cortex. J Neurosci 35:15466-76
Aflalo, Tyson; Kellis, Spencer; Klaes, Christian et al. (2015) Neurophysiology. Decoding motor imagery from the posterior parietal cortex of a tetraplegic human. Science 348:906-10
Andersen, Richard A; Andersen, Kristen N; Hwang, Eun Jung et al. (2014) Optic ataxia: from Balint's syndrome to the parietal reach region. Neuron 81:967-983
Andersen, Richard A; Kellis, Spencer; Klaes, Christian et al. (2014) Toward more versatile and intuitive cortical brain-machine interfaces. Curr Biol 24:R885-R897
Hwang, Eun Jung; Hauschild, Markus; Wilke, Melanie et al. (2014) Spatial and temporal eye-hand coordination relies on the parietal reach region. J Neurosci 34:12884-92
Graf, Arnulf B A; Andersen, Richard A (2014) Inferring eye position from populations of lateral intraparietal neurons. Elife 3:e02813
Graf, Arnulf B A; Andersen, Richard A (2014) Brain-machine interface for eye movements. Proc Natl Acad Sci U S A 111:17630-5
Revechkis, Boris; Aflalo, Tyson N S; Kellis, Spencer et al. (2014) Parietal neural prosthetic control of a computer cursor in a graphical-user-interface task. J Neural Eng 11:066014
Hwang, Eun Jung; Andersen, Richard A (2013) The utility of multichannel local field potentials for brain-machine interfaces. J Neural Eng 10:046005

Showing the most recent 10 out of 20 publications