Interacting with the physical environment and manipulating objects is an essential part of daily life. This ability is lost in upper-limb amputees as well as in patients with spinal cord injury, stroke, ALS, and other movement disorders. These people know what they want to do, as well as how they would do it if their arms were functional. If such knowledge could be decoded and sent to a prosthetic arm (or to the patient's own arm fitted with functional electrical stimulators), the lost motor function could be restored. The decoding is unlikely to be perfect; however, the brain can adapt to an imperfect decoder using real-time feedback. Several groups, including ours, have recently demonstrated that, at least in principle, this can be achieved. However, as is often the case in science, the initial work has been done in idealized conditions, and its applicability to real-world usage scenarios remains an open question. The goal of this project is to bring movement-control brain-machine interfaces (BMIs) closer to helping the people who need them, and at the same time to exploit the rich datasets we collect in order to advance our understanding of sensorimotor control and learning. This will be accomplished by creating hybrid BMIs that exploit information from multiple sources, combined with modern algorithms from machine learning and automatic control.
Being able to interact with the physical environment and manipulate objects is an essential part of daily life. Brain-machine interfaces are one way to restore this ability to patients who have lost it. The proposed project will bring brain-machine interfaces closer to helping patients in real-world object manipulation tasks.