Millions of people worldwide suffer from sensory and motor deficits caused by neurological conditions, such as spinal cord injury. While currently available treatments have limited efficacy, emerging neuroprosthetic technologies hold promise for these devastating conditions. Neural prosthetics are based on brain-machine interfaces (BMIs) that connect brain areas unaffected by injury directly to prosthetic limbs. It is expected that BMIs and other neuroprosthetics will reduce the burden of neurological disease by improving mobility, the ability to sense the external world, and overall quality of life. The goal of this project is to develop a sensorized neural prosthetic that enacts bimanual arm tasks. I will explore intracortical microstimulation (ICMS) of primary somatosensory cortex (S1) as the means to provide artificial tactile input for this two-armed BMI. The experiments will be conducted in rhesus monkeys chronically implanted with multielectrode cortical arrays. The project will include three specific aims:
1. Determine cortical correlates of bimanual decision-making guided by ICMS. I hypothesize that ICMS input will be processed by multiple cortical areas, with bimanual aspects of the tasks handled by nonprimary motor areas, such as the supplementary motor area. To test this hypothesis, two rhesus monkeys will be implanted bilaterally in multiple cortical areas. They will operate two joysticks to move avatar arms on a computer screen. ICMS of the primary somatosensory areas will guide this behavior by instructing which arm to use for a particular task and indicating the target of reaching movements. This experiment will illuminate the functional role of each cortical area.
2. Develop decoding algorithms to extract bimanual motor intentions. I hypothesize that the parameters of bimanual movements guided by ICMS can be extracted from the activity of multiple cortical areas. I will use BMI decoding algorithms for this purpose. The algorithms will implement multiplexing schemes to separate the epochs when ICMS is applied from those when cortical activity can be reliably recorded. BMI decoders will extract the position and velocity of the avatar arms from prerecorded neural data and in real time, and I will optimize their performance.
3. Implement a real-time brain-machine-brain interface for bimanual control. I hypothesize that monkeys will learn to control two avatar arms with their cortical activity while receiving artificial sensory input. To test this hypothesis, joystick control of the avatar arms will be replaced by direct brain control, using the BMI decoders developed in Specific Aim 2. I expect that brain circuitry will adapt plastically to improve task performance.
This aim will result in a prototype bimanual prosthetic that can in the future be introduced to clinical practice.
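The multiplexed decoding scheme outlined in Aim 2 can be illustrated with a minimal sketch. This is not the project's actual decoder; it assumes binned firing rates, uses a simple ridge-regularized linear (Wiener) filter in place of whatever algorithm is ultimately optimized, and all function names and parameters are hypothetical. The key ideas it demonstrates are (a) blanking the epochs contaminated by ICMS artifacts before decoding and (b) mapping a sliding window of population activity to avatar-arm kinematics.

```python
import numpy as np

def blank_icms_epochs(spikes, icms_mask):
    """Multiplexing step: replace neural samples recorded during ICMS
    delivery (where stimulation artifacts corrupt the recording) with
    the last artifact-free sample (zero-order hold), so the decoder
    always sees a continuous signal.
    spikes:    (T, n_units) binned firing rates
    icms_mask: (T,) boolean, True where ICMS was applied
    """
    clean = spikes.copy()
    last_clean = np.zeros(spikes.shape[1])
    for t in range(spikes.shape[0]):
        if icms_mask[t]:
            clean[t] = last_clean
        else:
            last_clean = clean[t]
    return clean

def fit_wiener_decoder(spikes, kinematics, n_lags=5, ridge=1e-6):
    """Fit a ridge-regularized linear filter mapping a sliding window
    of firing rates to kinematics (e.g. position and velocity of both
    avatar arms). Returns the weight matrix W."""
    T = spikes.shape[0]
    # Lagged design matrix: row i holds bins i .. i + n_lags - 1.
    X = np.hstack([spikes[lag:T - n_lags + lag + 1]
                   for lag in range(n_lags)])
    X = np.hstack([X, np.ones((X.shape[0], 1))])  # bias term
    Y = kinematics[n_lags - 1:]                   # align to window end
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

def decode(spikes, W, n_lags=5):
    """Apply the fitted filter to (prerecorded or streaming) activity."""
    T = spikes.shape[0]
    X = np.hstack([spikes[lag:T - n_lags + lag + 1]
                   for lag in range(n_lags)])
    X = np.hstack([X, np.ones((X.shape[0], 1))])
    return X @ W
```

In a real-time loop, `blank_icms_epochs` and `decode` would run on each incoming bin, with the ICMS controller supplying the stimulation mask.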
Millions of people around the world suffer from sensory and motor deficits caused by partial or complete paralysis. Brain-machine interfaces (BMIs) offer such patients the capability to regain mobility and sensation by connecting motor commands from intact cortical areas to prosthetic devices. Sensorized prosthetic devices can provide tactile information back to the brain via intracortical microstimulation (ICMS). The proposed project seeks to advance our understanding of how the brain controls bimanual movements guided by ICMS input.
Ifft, Peter J; Shokur, Solaiman; Li, Zheng; et al. (2013) A brain-machine interface enables bimanual arm movements in monkeys. Sci Transl Med 5:210ra154