The objective of this research is to integrate user control with automated reflexes in the human-machine interface. The approach, taking inspiration from biology, analyzes control-switching issues in brain-computer interfaces. A nonhuman primate will perform a manual task while movement- and touch-related brain signals are recorded. While a robotic hand replays the movements, electronic signals will be recorded from touch sensors on the robot's fingers, then mapped to touch-based brain signals, and used to give the subject tactile sensation via direct cortical stimulation. Context-dependent transfers of authority between the subject and reflex-like controls will be developed based on relationships between sensor signals and command signals.
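The context-dependent transfer of authority described above can be illustrated with a minimal blended control law. This is only a sketch under assumed conventions: the slip signal, threshold, and sigmoid weighting are hypothetical placeholders, not the project's actual controller.

```python
import math

def blend_authority(u_human, u_reflex, slip_signal,
                    slip_threshold=0.5, gain=4.0):
    """Blend a human command with a reflex-like command.

    As evidence of slip grows, authority shifts smoothly from the
    human to the reflex controller. All parameters here are
    illustrative assumptions.
    """
    # Map slip evidence to a weight alpha in (0, 1):
    # alpha near 0 -> mostly human authority,
    # alpha near 1 -> mostly reflex authority.
    # A sigmoid gives a smooth, rather than abrupt, handoff.
    alpha = 1.0 / (1.0 + math.exp(-gain * (slip_signal - slip_threshold)))
    return (1.0 - alpha) * u_human + alpha * u_reflex

# With little slip evidence, the output tracks the human command;
# with strong slip evidence, it tracks the reflex command.
low_slip = blend_authority(0.0, 1.0, slip_signal=0.0)
high_slip = blend_authority(0.0, 1.0, slip_signal=1.0)
```

The smooth weighting is one simple way to realize the "smooth, context-dependent" handoff the summary calls for; the actual mapping from sensor signals to context would be learned from the recorded sensor-command relationships.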
Issues of mixed authority and context awareness have general applicability in human-machine systems. This research advances methods for providing tactile feedback from a remote manipulator, dividing control according to the respective capabilities of human and machine, and transferring authority in a smooth, context-dependent manner. These principles are essential to any cyber-physical system requiring robustness in the face of uncertainty, control delays, or limited information flow.
The resulting transformative methods of human-machine communication and control will have applications for robotics (space, underwater, military, rescue, surgery, assistive, prosthetic), haptics, biomechanics, and neuroscience. Underrepresented undergraduates will be recruited from competitive university programs at Arizona State University and Mexico's Tec de Monterrey University. Outreach projects will engage the public and underrepresented school-aged children through interactive lab tours, instructional modules, and public lectures on robotics, human-machine systems, and social and ethical implications of neuroprostheses.