In a scant few years, the brain-machine interface (BMI) has gone from science fiction, to scientific curiosity, to a rapidly growing engineering discipline with real potential for clinical importance. The realization that signals recorded from the brain might be used to control inanimate objects has captured the fascination of the popular and scientific communities alike. However, despite a tremendous increase in attention and scientific work, two fundamental limitations remain: 1) the great majority of BMIs extract only kinematic (position) information from the brain, ignoring the wealth of force-related information that is also present in the primary motor cortex, and 2) virtually all existing BMIs depend exclusively on natural vision to guide movement, lacking the rapid proprioceptive feedback that is critical for normal movement. We propose to address both of these limitations by building on the progress we made in the previous grant cycle. We previously demonstrated both joint-torque and EMG predictions with accuracy comparable to that of kinematic predictions. We now propose to use this information as the basis for both a torque-based controller and an adaptive, hybrid torque-position controller. The decoder will use inputs from both the primary motor cortex and the dorsal premotor cortex. We hypothesize that this approach will allow the monkey subjects to perform more realistic tasks that require movement in a changing dynamical environment. Two typical examples are the need to grasp and move an object, and the need to control both endpoint force and position, as when writing. We have also demonstrated that visually guided BMI performance can be improved with the addition of natural proprioception, and that monkeys can discriminate electrical stimuli of different intensities delivered to proprioceptive areas of the cortex. We now propose to stimulate these areas to provide artificial proprioceptive feedback to the monkey. We will stimulate particular electrodes with patterns intended to mimic the signals that occur when the monkey's limb is perturbed during movement. We hypothesize that the stimulation will cause the monkey to initiate a short-latency correction in a direction determined by the particular characteristics of the stimulation. Ultimately, we propose to combine the hybrid, adaptive controller with the proprioceptive prosthesis and to test the monkeys' ability to adapt to the two interfaces. We postulate that plastic changes in the cortex, combined with algorithmic adaptation, will drive improvements in performance over a time course of several days to a week. The proposed experiments will lead directly to a clearer understanding of the signals encoded in both the motor and sensory areas of the brain, and of the adaptive processes that are critical when a patient recovers from neurological and musculoskeletal disorders such as stroke, amputation, or spinal cord injury. Furthermore, we anticipate that the resulting technology will directly benefit these same patients as it moves from the experimental arena to the clinic.
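To make the decoding approach concrete, the following minimal sketch (in Python) illustrates one plausible form of the hybrid, adaptive controller described above: a linear readout maps binned firing rates to joint torque, a toy forward model integrates torque into movement, and the resulting position error nudges the readout weights. The array sizes, parameter values, and LMS-style update rule are illustrative assumptions, not the decoder actually proposed here.

import numpy as np

# A minimal, hypothetical sketch of a hybrid torque-position decoder.
# Firing rates drive a linear torque readout; decoded torque is
# integrated through toy limb dynamics; the position error then nudges
# the torque weights (LMS-style), giving the "adaptive, hybrid" behavior.
# All sizes, rates, and constants below are assumptions for illustration.

rng = np.random.default_rng(0)
n_units, n_joints = 96, 2                       # e.g., one array; shoulder/elbow

def decode_step(W, rates, pos, vel, target, inertia=1.0, eta=1e-4, dt=0.05):
    """One 50-ms decoder tick: rates -> torque -> kinematics -> weight update."""
    torque = W @ rates                           # torque-based control signal
    vel = vel + (torque / inertia) * dt          # toy dynamics: acc = torque / I
    pos = pos + vel * dt
    W = W + eta * np.outer(target - pos, rates)  # position error corrects weights
    return W, pos, vel

# Usage: drive the decoder with stand-in Poisson spike counts.
W = rng.normal(0.0, 0.01, (n_joints, n_units))
pos, vel = np.zeros(n_joints), np.zeros(n_joints)
target = np.array([0.10, -0.05])
for _ in range(200):
    rates = rng.poisson(5.0, n_units).astype(float)
    W, pos, vel = decode_step(W, rates, pos, vel, target)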
Brain-machine interface (BMI) technology now allows both human and animal subjects to control the position of a robotic limb, but it does not yet allow them to apply controlled forces to objects, nor can it provide normal proprioceptive feedback to the subject. Our proposed work mimics several aspects of the intact nervous system by providing direct control of force through a torque-decoding BMI that adapts in response to position errors. In addition to natural vision, the monkey subjects will receive artificial proprioceptive feedback about the movement through electrical microstimulation of the brain.
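As a sketch of how such a proprioceptive feedback channel might be parameterized, the example below maps a detected limb perturbation to a stimulation pattern: the perturbation's direction selects an electrode site and its magnitude sets the pulse rate. The electrode layout, frequency range, and mapping are hypothetical illustrations, not the stimulation protocol itself.

import numpy as np

# Hypothetical mapping from a limb perturbation to a microstimulation
# pattern. Direction selects a site from an assumed ring of electrodes
# in proprioceptive cortex; magnitude is rate-coded within an assumed
# safe pulse-rate range. All values are illustrative, not the protocol.

N_SITES = 8                          # assumed electrode ring
F_MIN, F_MAX = 50.0, 300.0           # assumed pulse-rate bounds (Hz)

def stim_pattern(perturb_xy, max_force=5.0):
    """Map a 2-D perturbation vector to (electrode index, pulse rate in Hz)."""
    angle = np.arctan2(perturb_xy[1], perturb_xy[0])        # perturbation direction
    site = int(round(angle / (2 * np.pi) * N_SITES)) % N_SITES
    mag = min(np.hypot(perturb_xy[0], perturb_xy[1]) / max_force, 1.0)
    rate = F_MIN + mag * (F_MAX - F_MIN)                    # rate-coded magnitude
    return site, rate

site, rate = stim_pattern(np.array([2.0, -1.5]))            # example perturbation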