Severe motor disabilities, including locked-in syndrome and paralysis, affect the quality of life of millions of people worldwide. The PIs' prior work in brain-computer interfaces based on functional near-infrared (fNIR) imaging has shown great promise for restoring communication and environmental control to people with such disabilities. Currently, typical control interfaces for these systems are simple discrete selection paradigms, which have proven effective but offer limited information throughput. Innovative control interfaces based on continuous control paradigms, which dynamically map brain signal levels to control signals, have not been adequately studied for fNIR imaging. Depending upon the extent to which brain signals can be effectively mapped to continuous control, adding this capability to existing discrete control could significantly increase the range of tasks that users of an fNIR-based direct brain interface can perform (e.g., positional selection or 2-D drawing).

In this work, the PIs will explore innovative direct brain-computer interfaces for continuous control and use them to develop applications for creative expression. For people with severe motor disabilities, creative expression can provide an emotional outlet as well as mental exercise that improves quality of life. The tasks inherent in creating visual art, such as drawing, coloring, and texturing, cannot be accomplished with discrete controls alone. Visual art therefore provides an ideal experimental platform for studying fNIR-based continuous control interfaces, as well as an engaging and motivating platform for training that will improve users' ability to control a direct brain interface.

To these ends, the PIs will study non-traditional control interfaces for continuous and discrete selection, such as wheels, dials, and gauges, to determine to what extent fNIR signals can be mapped to continuous control. They will explore continuous methods for selecting and controlling art media such as brushes, colors, textures, and shapes, and investigate to what extent continuous brain signals can be translated into visual art gestures (drawing, shading, coloring). The advice of a professional, internationally known artist who has ALS will guide the user requirements of the control interfaces. Quantitative and qualitative user performance data will be collected and used, among other things, to compare learning effects under the visual art paradigm against traditional discrete selection exercises, to determine whether training time and performance can be improved. Project outcomes will add to the body of knowledge for assistive technology and human-computer interfaces.
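As a purely illustrative sketch (not part of the proposal), one simple way to map a brain signal level to a continuous control such as a dial is a bounded linear mapping: a normalized oxygenation value is clamped to a fixed range and rescaled to the dial's sweep. The function name, signal range, and dial range below are hypothetical.

def oxygenation_to_dial(level, low=-1.0, high=1.0, dial_min=0.0, dial_max=360.0):
    """Map a (hypothetical) normalized fNIR oxygenation level to a dial angle.

    Values outside [low, high] are clamped so the dial never overshoots its sweep.
    """
    clamped = max(low, min(high, level))
    fraction = (clamped - low) / (high - low)   # 0.0 .. 1.0
    return dial_min + fraction * (dial_max - dial_min)

# Example: a mid-range signal lands the dial at the middle of its sweep.
print(oxygenation_to_dial(0.0))   # 180.0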

Broader Impacts: Methods for translating cortical oxygenation signals into continuous control signals for user interfaces will have mainstream applications for assistive technology, essentially by "smoothing" noisy input signals. Such techniques could benefit users with reduced motor coordination, including the elderly, young children, and people with motor disorders such as Parkinson's disease. Mainstream users may benefit from a hands-free interface, and neural control could add new dimensions to the creative process.
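The abstract does not specify a filtering method; as one hedged illustration of "smoothing" a noisy input stream into a steadier control signal, an exponential moving average (a one-pole low-pass filter) is a common choice. The class name, alpha value, and sample data below are hypothetical.

class ExponentialSmoother:
    """Exponential moving average: smooths a noisy sample stream before it is
    mapped to a continuous control value.

    alpha near 0 -> heavy smoothing (slow response);
    alpha near 1 -> light smoothing (fast response).
    """
    def __init__(self, alpha=0.1):
        self.alpha = alpha
        self.state = None

    def update(self, sample):
        if self.state is None:
            self.state = sample          # seed the filter with the first sample
        else:
            self.state = self.alpha * sample + (1 - self.alpha) * self.state
        return self.state

# Example: the filtered output varies far less than the raw samples.
smoother = ExponentialSmoother(alpha=0.2)
for x in [0.9, 0.1, 0.8, 0.2, 0.7]:
    print(round(smoother.update(x), 3))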

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Application #: 0705679
Program Officer: Ephraim P. Glinert
Project Start:
Project End:
Budget Start: 2007-08-01
Budget End: 2010-07-31
Support Year:
Fiscal Year: 2007
Total Cost: $291,636
Indirect Cost:
Name: Georgia Tech Research Corporation
Department:
Type:
DUNS #:
City: Atlanta
State: GA
Country: United States
Zip Code: 30332