The objective of this research is to create interfaces that enable people with impaired sensory-motor function to control interactive cyber-physical systems such as artificial limbs, wheelchairs, automobiles, and aircraft. The approach is based on the premise that performance can be significantly enhanced merely by warping the perceptual feedback provided to the human user. A systematic way to design this feedback will be developed by addressing a number of underlying mathematical and computational challenges.
The intellectual merit lies in the way that perceptual feedback is constructed. Local performance criteria like stability and collision avoidance are encoded by potential functions, and gradients of these functions are used to warp the display. Global performance criteria like optimal navigation are encoded by conditional probabilities on a language of motion primitives, and metric embeddings of these probabilities are used to warp the display. Together, these two types of feedback facilitate improved safety and performance while still allowing the user to retain full control over the system.
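To make the first mechanism concrete, the sketch below is a minimal Python illustration of how the gradient of an obstacle potential might be used to warp a displayed cursor position. The inverse-distance potential, the numerical gradient, the sign convention, and the gain are all illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

def obstacle_potential(x, obstacles, scale=1.0):
    # Inverse-distance potential summed over obstacle centers (an assumed, illustrative form).
    return sum(scale / (np.linalg.norm(x - o) + 1e-6) for o in obstacles)

def potential_gradient(x, obstacles, eps=1e-4):
    # Central-difference estimate of the potential's gradient at x.
    grad = np.zeros_like(x)
    for i in range(len(x)):
        step = np.zeros_like(x)
        step[i] = eps
        grad[i] = (obstacle_potential(x + step, obstacles)
                   - obstacle_potential(x - step, obstacles)) / (2 * eps)
    return grad

def warp_display(true_state, obstacles, gain=0.001):
    # Shift the displayed state up the potential gradient (toward nearby obstacles),
    # exaggerating proximity so the user corrects earlier; sign and gain are assumptions.
    return true_state + gain * potential_gradient(true_state, obstacles)

# Example: a cursor near an obstacle at the origin is displayed slightly closer to it.
obstacles = [np.array([0.0, 0.0])]
print(warp_display(np.array([0.3, 0.0]), obstacles))
```

The key design point is that only the feedback is altered; the user's commands still drive the true system state directly.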
If successful, this research could improve the lives of people suffering from debilitating physical conditions such as amputation or stroke, and could also protect people, such as drivers or pilots, who are impaired by transient conditions such as fatigue, boredom, or substance abuse. Undergraduate and graduate engineering students will benefit through involvement in research projects, and K-12 students and teachers will benefit through participation in exhibits presented at the Engineering Open House, an event hosted annually by the College of Engineering at the University of Illinois.
The objective of this research has been to create interfaces that enable people with impaired sensory-motor function to control interactive cyber-physical systems such as artificial limbs, wheelchairs, automobiles, and aircraft. The approach was based on the premise that performance could be significantly enhanced merely by warping the perceptual feedback provided to the human user. A systematic way to design this feedback was developed by addressing a number of underlying mathematical and computational challenges.

The intellectual merit derived from the way that perceptual feedback was constructed. Local performance criteria like stability and collision avoidance were encoded by potential functions, and gradients of these functions were used to warp the display. Global performance criteria like optimal navigation were encoded by conditional probabilities on a language of motion primitives, and metric embeddings of these probabilities were used to warp the display. Together, these two types of feedback improved safety and performance while still allowing the user to retain full control over the system.

The broader impact derived from potential improvement in the lives of people suffering from debilitating physical conditions such as amputation or stroke, and also from potential safeguards for drivers or pilots who are impaired by transient conditions such as fatigue, boredom, or substance abuse.

Specific contributions included the following:

1) An interface that enables a human pilot to remotely teleoperate an unmanned aircraft flying at a fixed altitude with input only from an electroencephalograph, using a paradigm called motor imagery. This interface was enabled by deriving an optimal communication protocol that specifies exactly what type of perceptual feedback should be provided to the user.

2) An interface that enables a human to type at a computer at speeds of more than 10 characters per minute with input only from an electroencephalograph, using a paradigm called the steady-state visually evoked potential. This interface worked by asking a sequence of questions, such as: "Is the desired character in the range A-E, F-M, or N-Z?" Active inference was used to choose the question that would most quickly identify the desired character, as illustrated by the sketch below.
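As a rough illustration of that question-selection step, the following Python sketch greedily picks the contiguous three-way split of the alphabet that maximizes expected information gain about the desired character. It assumes noiseless answers and a simple candidate set of splits; the project's active-inference formulation would additionally model uncertainty in the SSVEP response, so this is a simplified stand-in rather than the reported method.

```python
import itertools
import math
import string

def entropy(p):
    # Shannon entropy (bits) of a probability table given as a dict.
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def expected_entropy(posterior, groups):
    # Expected remaining entropy after a noiseless answer naming the group
    # that contains the desired character.
    total = 0.0
    for group in groups:
        mass = sum(posterior[c] for c in group)
        if mass == 0:
            continue
        cond = {c: posterior[c] / mass for c in group}
        total += mass * entropy(cond)
    return total

def best_three_way_split(posterior):
    # Search all contiguous three-way splits of the (sorted) alphabet and keep
    # the one minimizing expected remaining entropy, i.e., maximizing information gain.
    chars = sorted(posterior)
    best, best_h = None, float("inf")
    for i, j in itertools.combinations(range(1, len(chars)), 2):
        groups = (chars[:i], chars[i:j], chars[j:])
        h = expected_entropy(posterior, groups)
        if h < best_h:
            best, best_h = groups, h
    return best

# Example: with a uniform prior over A-Z, the chosen split is roughly balanced.
prior = {c: 1 / 26 for c in string.ascii_uppercase}
groups = best_three_way_split(prior)
print([g[0] + "-" + g[-1] for g in groups])
```

After each answer, the posterior would be renormalized over the named group and the selection repeated until a single character remains.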