The proliferation of brain-computer interface (BCI) technology promises locked-in patients potential ways to communicate successfully. Most BCI systems either involve selection from among a set of simultaneously presented stimuli, requiring extensive control of the interface, or use binary stimulus selection mechanisms that fail to achieve high communication rates because of slow intent detection or a fixed (context-independent) ordering of stimuli. We propose a new interface using binary selection of text input via rapid serial visual presentation of natural language components. Individuals with severe speech and physical impairments (SSPI) resulting from acquired neurological disorders (amyotrophic lateral sclerosis, brainstem stroke, Parkinson's disease, multiple sclerosis, spinal cord injury) and neurodevelopmental disorders (cerebral palsy, muscular dystrophy) drive the proposed research. Four laboratories form an alliance for this translational research project: basic research (Erdogmus, engineering; Roark, computer science and natural language processing) and clinical research (Oken, neurology/neurophysiology; Fried-Oken, augmentative communication/neurogenic communication disorders).
Our aims are (1) to develop an innovative EEG-based BCI that achieves increased communication rates with fewer errors and greater satisfaction for the target SSPI populations; (2) to iteratively refine the system in the laboratory with user feedback from healthy subjects and expert locked-in syndrome (LIS) users of marketed AAC systems; and (3) to evaluate the performance of the system within the natural clinical settings of SSPI patients. The innovative BCI is the RSVP Keyboard, with three essential features: (1) rapid serial visual presentation (RSVP) of linguistic components ranging from letters to words to phrases; (2) a detection mechanism that employs multichannel electroencephalography (EEG) and/or other suitable response mechanisms that can reliably indicate the binary intent of the user and adapt based on individualized neurophysiologic data of the user; and (3) an open-vocabulary natural language model capable of accurate predictions of upcoming text. The theoretical framework rests on a solid Bayesian foundation; clinical usability is based on the WHO ICF (WHO, 2001) and an Augmentative and Alternative Communication (AAC) model of participation. Rigorous experimental scrutiny in both clinical laboratory and natural settings will be obtained with able-bodied subjects and SSPI patients. Measures of learning rate, speed of message production, error rate, and user satisfaction for different iterations of the RSVP Keyboard will be obtained using a hypothesis-driven crossover design for 36 healthy subjects and an alternating-treatment randomization design for 40 patients with SSPI. Descriptions of the motor, cognitive, and language skills of LIS patients using the novel system in their natural environments will inform clinical guidelines and functional device adaptations to better individualize treatment for children and adults with SSPI. The collaborative nature of the proposed translational research is expected to yield new knowledge for both BCI development and clinical AAC use.
Relevance: The populations of patients with locked-in syndrome are increasing as medical technologies advance and successfully support life. These individuals with limited to no movement could potentially contribute to their medical decision making, informed consent, and daily caregiving if they had faster, more reliable means to interface with communication systems. The RSVP Keyboard and proposed language models are innovative technological discoveries that are being applied to clinical augmentative communication tools so that patients and their families can participate in daily activities and advocate for improvements in standard clinical care. The proposed project stresses the translation of basic computer science into clinical care, supporting the proposed NIH Roadmap and public health initiatives.

Agency
National Institutes of Health (NIH)
Institute
National Institute on Deafness and Other Communication Disorders (NIDCD)
Type
Research Project (R01)
Project #
5R01DC009834-04
Application #
8213637
Study Section
Special Emphasis Panel (ZDC1-SRB-R (35))
Program Officer
Shekim, Lana O
Project Start
2009-02-01
Project End
2014-01-31
Budget Start
2012-02-01
Budget End
2013-01-31
Support Year
4
Fiscal Year
2012
Total Cost
$699,362
Indirect Cost
$238,578
Name
Oregon Health and Science University
Department
Neurology
Type
Schools of Medicine
DUNS #
096997515
City
Portland
State
OR
Country
United States
Zip Code
97239
Goodrich, Elena; Wahbeh, Helané; Mooney, Aimee et al. (2015) Teaching mindfulness meditation to adults with severe speech and physical impairments: An exploratory study. Neuropsychol Rehabil 25:708-32
Fried-Oken, Melanie; Mooney, Aimee; Peters, Betts et al. (2015) A clinical screening protocol for the RSVP Keyboard brain-computer interface. Disabil Rehabil Assist Technol 10:11-8
Oken, Barry S; Orhan, Umut; Roark, Brian et al. (2014) Brain-computer interface with language model-electroencephalography fusion for locked-in syndrome. Neurorehabil Neural Repair 28:387-94
Orhan, Umut; Erdogmus, Deniz; Roark, Brian et al. (2013) Offline analysis of context contribution to ERP-based typing BCI performance. J Neural Eng 10:066003
Roark, Brian; Beckley, Russell; Gibbons, Chris et al. (2013) Huffman scanning: using language models within fixed-grid keyboard emulation. Comput Speech Lang 27: