Nearly two million people in the United States suffer from severe motor disabilities that render them incapable of communicating with the outside world. Many of these cases involve a gradual onset of disability, which creates a window of opportunity for assistive technology. Amyotrophic lateral sclerosis (ALS), in particular, is a progressive motor neuron disease that slowly erodes a person's ability to produce motor movements, ultimately leaving its victims in a locked-in state in which they are completely paralyzed and unable to communicate. Direct brain interfaces (DBIs), an emerging technology based on the measurement of neural activation, have the potential to provide people with such severe motor disabilities an alternative means of communicating with the rest of the world. To date, however, the best performance achieved by a DBI communication system is about 68 bits per minute (just over 8 characters per minute, assuming 8 bits per character), which pales in comparison to the transmission rates attained by speakers and signers (175-200 words per minute). Building on previous research suggesting that imagined movements produce neural activations similar to executed movements, although of lesser magnitude, the PIs hypothesize that DBI communication rates could be increased by recognizing phrases of American Sign Language (ASL) from motor cortex activity. Because people who are completely locked-in remain capable of imagining motor movements even though their bodies can no longer physically execute them, a DBI that recognizes imagined motor movements could be fully accessible to locked-in users.

The PIs envisage a DBI system, which they call BrainSign, that would be phased in as an alternative communication device for patients diagnosed in the early stages of a progressive motor neuron disease such as ALS. Upon initial diagnosis, patients would learn to execute useful signs and sign phrases; at this early stage of the disease's progression, BrainSign would learn the pattern of neural activity each patient exhibits while executing each sign. As the disease progresses and the patient loses mobility, BrainSign would adapt to recognize the neural activity associated with motor imagery rather than actual motor movement, so that when the patient eventually becomes completely locked-in, BrainSign would recognize each imagined sign and display the appropriate English translation, providing an efficient method for communicating with caregivers, friends, and family. Whether this scenario can actually be achieved is unclear; hence this exploratory project, whose objectives are to characterize the extent to which individual ASL gestures of varying complexity can be discriminated by means of fMRI, and then to apply this knowledge to create a first prototype of a portable system that recognizes ASL from brain signals.
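To make the envisioned phased adaptation concrete, the following toy Python sketch shows one plausible shape for such a decoder: a classifier is first trained on executed-movement activation patterns, then re-fit on imagined-movement patterns of lesser magnitude, and finally used to map a decoded sign to its English gloss. The synthetic data, the `SIGNS` vocabulary, and the logistic-regression decoder are purely illustrative assumptions, not the PIs' actual design or results.

```python
# Illustrative sketch only: a toy decoder that (1) learns sign classes from
# "executed movement" activation patterns, then (2) is re-fit on
# "imagined movement" patterns, which prior work suggests resemble executed
# patterns at lower magnitude. Synthetic data stands in for real fMRI.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
SIGNS = ["WATER", "PAIN", "HELP", "YES", "NO"]   # hypothetical sign vocabulary
N_VOXELS = 200                                   # toy number of motor-cortex voxels

# Each sign gets a characteristic activation template; imagined trials reuse
# the same templates at reduced amplitude (the "lesser magnitude" assumption).
templates = rng.normal(0.0, 1.0, size=(len(SIGNS), N_VOXELS))

def trials(n_per_sign, amplitude, noise=1.0):
    """Generate synthetic activation vectors for every sign class."""
    X, y = [], []
    for label, t in enumerate(templates):
        X.append(amplitude * t + rng.normal(0.0, noise, size=(n_per_sign, N_VOXELS)))
        y.extend([label] * n_per_sign)
    return np.vstack(X), np.array(y)

# Phase 1: train on executed-movement data collected soon after diagnosis.
X_exec, y_exec = trials(n_per_sign=40, amplitude=1.0)
decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
decoder.fit(X_exec, y_exec)

# Phase 2: as mobility declines, re-fit on imagined-movement data (weaker signal).
X_imag, y_imag = trials(n_per_sign=20, amplitude=0.4)
decoder.fit(np.vstack([X_exec, X_imag]), np.concatenate([y_exec, y_imag]))

# Phase 3: decode a new imagined trial and display its English gloss.
X_test, y_test = trials(n_per_sign=5, amplitude=0.4)
print("accuracy on imagined trials:", decoder.score(X_test, y_test))
print("decoded sign:", SIGNS[decoder.predict(X_test[:1])[0]])
```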

Broader Impacts: This work will lay the foundations for DBIs that provide much higher information transmission rates than have heretofore been achievable. Such systems will ultimately be able to assist not only locked-in people but also many others who work in mobility-restricted environments (e.g., underwater research) and in situations where vocal communication is not possible. The research will also contribute to the field of cognitive neuroscience by providing the first comprehensive study of spatially co-located, cognitively orthogonal motor tasks. The PIs will make their data and results available via a public database, so that others can improve on the results using their own algorithms.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant
Application #: 0836747
Program Officer: Ephraim P. Glinert
Budget Start: 2008-08-01
Budget End: 2009-07-31
Fiscal Year: 2008
Total Cost: $35,000
Institution: Georgia Tech Research Corporation
City: Atlanta
State: GA
Country: United States
Zip Code: 30332