The promise of brain-computer interfaces (BCIs) for communication is becoming a reality for individuals with severe speech and physical impairments (SSPI) who cannot rely on speech or writing to express themselves. While the majority of research efforts are devoted to technology development addressing problems of stability, reliability, and classification, clinical and behavioral challenges are becoming more apparent as individuals with SSPI and their family/care teams assess these systems during initial and long-term trials. The objective of the RSVP Keyboard(tm) BCI translational research team is to address the clinical challenges raised during functional BCI use with innovative engineering design, thereby enhancing the potential of this novel assistive technology.
Four specific aims are proposed: (1) to develop a BCI Communication Application Suite (BCI-CAS) offering a set of language modules matched to the language/literacy skills of people with SSPI; (2) to develop improved statistical signal models for personalized feature extraction, artifact/interference handling, and robust, accurate intent evidence extraction from physiologic signals; (3) to develop improved language models and stimulus sequence optimization methods; and (4) to evaluate cognitive variables that affect learning and performance with the BCI-CAS. Five language modules are proposed, all relying on a multimodal evidence fusion framework for model-based, context-aware, optimal intent inference: RSVP Keyboard(tm) generative spelling; RSVP texting; RSVP in-context typing; RSVP in-context icon typing; and binary yes/no responses with SSVEPs. Usability data on the current RSVP Keyboard(tm) and SSVEP system drive all proposed aims. Users select a language module, and the BCI system optimizes performance for each individual through user adaptation, intent inference, and personalized language modeling. A unique simulation function drives individualization of system parameters. The robustness of the BCI customization efforts is evaluated continually and iteratively by adults with SSPI and neurotypical controls. Three intervention programs that address the cognitive construct of attention (process-specific attention training, mindfulness meditation training, and novel stimulus presentations) will be implemented through hypothesis-driven single-subject designs. Thirty participants with SSPI, ages 21 years and older, will be included in home-based interventions. By measuring information transfer rate (ITR), user satisfaction, and intrinsic user factors, we will identify learning strategies that influence BCI skill acquisition and performance for adults with neurodegenerative or neurodevelopmental conditions.
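The ITR outcome measure named above is conventionally computed with the Wolpaw formulation, which converts selection accuracy and the number of selectable targets into bits per selection. The sketch below assumes that standard definition; the class count and accuracy values in the usage example are illustrative, not drawn from the project.

```python
import math

def itr_bits_per_selection(n_classes: int, accuracy: float) -> float:
    """Wolpaw ITR in bits per selection.

    n_classes: number of possible targets (e.g. 28 symbols for a speller).
    accuracy:  probability of a correct selection, in (0, 1].
    Values at or below chance are clamped to 0 bits, a common convention.
    """
    if accuracy >= 1.0:
        return math.log2(n_classes)
    if accuracy <= 1.0 / n_classes:
        return 0.0
    p = accuracy
    return (math.log2(n_classes)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n_classes - 1)))

def itr_bits_per_minute(n_classes: int, accuracy: float,
                        selections_per_minute: float) -> float:
    """Scale per-selection ITR by the user's selection rate."""
    return itr_bits_per_selection(n_classes, accuracy) * selections_per_minute
```

For example, a hypothetical 28-symbol speller operated at 90% accuracy yields roughly 3.86 bits per selection, so selection rate becomes the dominant factor in communication throughput.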
The translational teams include (1) signal processing (Erdogmus); (2) clinical neurophysiology (Oken); (3) natural language processing (Bedrick/Gorman); and (4) assistive technology (Fried-Oken). We continue to rely on a solid Bayesian foundation and theoretical frameworks: ICF disability classification (WHO, 2001), the AAC model of participation (Beukelman & Mirenda, 2013) and the Matching Person to Technology Model (Scherer, 2002).
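The Bayesian foundation referenced above underlies the suite's evidence fusion: a language-model prior over candidate symbols is combined with EEG-derived evidence and renormalized into a posterior. A minimal sketch of that fusion step follows; the function name, symbol set, and probability values are illustrative assumptions, not the project's implementation.

```python
def fuse_posteriors(lm_prior: dict, eeg_likelihood: dict) -> dict:
    """Fuse a language-model prior with per-symbol EEG evidence.

    Posterior(symbol) is proportional to prior(symbol) * likelihood(symbol),
    renormalized so the posterior sums to 1. Missing likelihoods get a
    small floor so no candidate is zeroed out by absent evidence.
    """
    unnorm = {s: lm_prior[s] * eeg_likelihood.get(s, 1e-12) for s in lm_prior}
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}

# Illustrative values: the language model favors 'a', but the EEG
# evidence for 'b' is strong enough to flip the decision.
posterior = fuse_posteriors(
    {"a": 0.5, "b": 0.3, "c": 0.2},   # language-model prior
    {"a": 0.2, "b": 0.6, "c": 0.2},   # EEG evidence per symbol
)
```

This multiplicative fusion is what lets contextual language modeling compensate for noisy physiologic evidence, and vice versa.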
The population of patients with severe speech and physical impairments secondary to neurodevelopmental and neurodegenerative diseases is increasing as medical technologies advance and successfully support life. These individuals with limited to no movement could contribute to their medical decision making, informed consent, and daily caregiving if they had faster, more reliable means to interface with communication systems. The BCI Communication Application Suite is an innovative hybrid brain-computer interface that enables patients and their families to participate in daily activities and advocate for improvements in standard clinical care. The proposed project stresses the translation of basic computer science into clinical care, supporting the NIH Roadmap and public health initiatives.
|Peters, Betts; Mooney, Aimee; Oken, Barry et al. (2016) Soliciting BCI user experience feedback from people with severe speech and physical impairments. Brain Comput Interfaces (Abingdon) 3:47-58|
|Andresen, Elena M; Fried-Oken, Melanie; Peters, Betts et al. (2016) Initial constructs for patient-centered outcome measures to evaluate brain-computer interfaces. Disabil Rehabil Assist Technol 11:548-57|
|Goodrich, Elena; Wahbeh, Helané; Mooney, Aimee et al. (2015) Teaching mindfulness meditation to adults with severe speech and physical impairments: An exploratory study. Neuropsychol Rehabil 25:708-32|
|Fried-Oken, Melanie; Mooney, Aimee; Peters, Betts (2015) Supporting communication for patients with neurodegenerative disease. NeuroRehabilitation 37:69-87|
|Peters, Betts; Bieker, Gregory; Heckman, Susan M et al. (2015) Brain-computer interface users speak up: the Virtual Users' Forum at the 2013 International Brain-Computer Interface Meeting. Arch Phys Med Rehabil 96:S33-7|
|Roark, Brian; Fried-Oken, Melanie; Gibbons, Chris (2015) Huffman and linear scanning methods with statistical language models. Augment Altern Commun 31:37-50|
|Fried-Oken, Melanie; Mooney, Aimee; Peters, Betts et al. (2015) A clinical screening protocol for the RSVP Keyboard brain-computer interface. Disabil Rehabil Assist Technol 10:11-8|
|Oken, Barry S; Orhan, Umut; Roark, Brian et al. (2014) Brain-computer interface with language model-electroencephalography fusion for locked-in syndrome. Neurorehabil Neural Repair 28:387-94|
|Orhan, Umut; Erdogmus, Deniz; Roark, Brian et al. (2013) Offline analysis of context contribution to ERP-based typing BCI performance. J Neural Eng 10:066003|
|Roark, Brian; Beckley, Russell; Gibbons, Chris et al. (2013) Huffman scanning: using language models within fixed-grid keyboard emulation. Comput Speech Lang 27:|
Showing the most recent 10 out of 17 publications