The increasing availability of affordable, high-performance virtual reality (VR) headsets creates great potential for applications including education, training, and therapy. In many of these applications, the ability to sense a user's mental state could provide key benefits. For instance, a VR environment could use brain signals such as the electroencephalogram (EEG) to infer aspects of the user's mental workload or emotional state; this, in turn, could be used to adjust the difficulty of a training task to better suit each user's experience. Such EEG feedback could be valuable not only for training but also for improving performance in real applications, including aviation, healthcare, defense, and driving. This project's goal is to develop methods and algorithms for integrating EEG sensors into current VR headsets, which provide a logical and unobtrusive platform for mounting these sensors. However, important challenges must be overcome. For instance, laboratory EEG sensors are typically applied with a conductive gel, but sensors built into a VR headset will need to work reliably in "dry" conditions without the gel. Further, laboratory EEG is usually recorded while participants sit still, whereas algorithms for processing headset EEG will need to account for users' head and body motion.
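
As a rough illustration of the closed-loop adaptation described above, the sketch below computes a classic frontal-theta/parietal-alpha band-power ratio, a common proxy for mental workload in the EEG literature, and nudges task difficulty around it. The sampling rate, channel roles, band edges, and threshold are all illustrative assumptions, not the project's actual pipeline.

```python
# Minimal sketch of a passive workload metric: frontal theta / parietal
# alpha band-power ratio. Channel roles, band edges, and the difficulty
# threshold are illustrative assumptions only.
import numpy as np
from scipy.signal import welch

FS = 250  # sampling rate in Hz (assumed)

def band_power(signal, fs, lo, hi):
    """Average power spectral density of `signal` in the [lo, hi] Hz band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def workload_index(frontal_eeg, parietal_eeg, fs=FS):
    """Theta/alpha ratio: tends to rise with workload, fall as the user relaxes."""
    theta = band_power(frontal_eeg, fs, 4.0, 7.0)    # frontal theta band
    alpha = band_power(parietal_eeg, fs, 8.0, 12.0)  # parietal alpha band
    return theta / alpha

# Toy usage: a VR task loop could raise or lower difficulty around the index.
rng = np.random.default_rng(0)
frontal, parietal = rng.standard_normal((2, FS * 10))  # 10 s of fake EEG
idx = workload_index(frontal, parietal)
difficulty_delta = -1 if idx > 1.5 else +1  # threshold is arbitrary
```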

To address these challenges, the project team will build on recent advances in dry EEG electrode technologies and motion-artifact suppression algorithms, focusing on passive monitoring and cognitive-state feedback. Such passive feedback is likely to be more usable in virtual environments than active EEG control, both because users will interact with the environment directly through other input methods and because passive EEG sensing is more tolerant of slow response times and decoding errors than active control. Prior studies have demonstrated the potential of EEG for cognitive-state decoding in controlled laboratory scenarios, but practical EEG integration for closed-loop neurofeedback in interactive VR environments requires answering three critical questions: (1) can more practical and convenient dry EEG sensors achieve results comparable to wet sensors; (2) can passive EEG cognitive-state decoding be made robust to movement-related artifacts; and (3) can these decoding schemes generalize across a variety of cognitive tasks and to closed-loop paradigms? To address these questions, classical cognitive tasks and more complex simulator tasks will be implemented and tested as novel, interactive VR environments. Building on preliminary results that characterized movement artifacts and decoded cognitive workload in interactive VR using active wet EEG sensors, this work will further explore the practical integration of EEG sensors with room-scale VR headsets, balancing data quality, cognitive decoding performance, ease of setup and use, and user comfort.
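
As an illustration of what movement-artifact suppression can look like in practice, the sketch below removes the best linear fit of motion reference channels from each EEG channel by ordinary least squares. This is a generic reference-regression technique, offered under the assumption that time-aligned motion signals (e.g., from the headset's IMU) are available; it is not a description of the team's actual algorithm.

```python
# Generic reference-regression approach to motion-artifact suppression:
# regress time-aligned motion channels (e.g., headset accelerometer traces)
# out of each EEG channel by least squares. A sketch, not the award's method.
import numpy as np

def regress_out_motion(eeg, motion):
    """
    eeg:    (n_samples, n_eeg_channels) array
    motion: (n_samples, n_motion_channels) array, time-aligned with eeg
    Returns the EEG with the best linear fit of the motion channels removed.
    """
    # Add a constant column so the fit can absorb DC offsets.
    X = np.column_stack([motion, np.ones(len(motion))])
    beta, *_ = np.linalg.lstsq(X, eeg, rcond=None)  # (n_motion+1, n_eeg)
    return eeg - X @ beta

# Toy usage with synthetic data: EEG contaminated by scaled head motion.
rng = np.random.default_rng(1)
motion = rng.standard_normal((5000, 3))           # 3-axis accelerometer
clean = rng.standard_normal((5000, 8))            # 8 "true" EEG channels
contaminated = clean + motion @ rng.standard_normal((3, 8))
recovered = regress_out_motion(contaminated, motion)
```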

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 1944389
Program Officer: Balakrishnan Prabhakaran
Budget Start: 2019-10-01
Budget End: 2021-09-30
Fiscal Year: 2019
Total Cost: $209,996
Name: Virginia Commonwealth University
City: Richmond
State: VA
Country: United States
Zip Code: 23298