Brain-computer interfaces (BCIs) have made dramatic progress in recent years. Their main application to date has been for physically disabled users, for whom they typically serve as the sole input means. Recent results on the real-time measurement and machine learning classification of functional near-infrared spectroscopy (fNIRS) brain data led to this project, in which the PI and his team will develop and evaluate brain measurement technology as input to adaptable user interfaces for the general population. Here, brain input is used to obtain more information about users and their context effortlessly and directly from their brain activity, and that information is then used to adapt the user interface in real time. To accomplish this, a multi-modal dual-task interface between humans and robots will be introduced, which will serve as a particularly sensitive testbed for evaluating the efficacy of these new interfaces.
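The adaptation loop described above — classify the user's brain state from measured features, then adjust the interface — can be sketched in a few lines. The following is a minimal illustration, not the project's actual pipeline: the feature dimensions, workload labels, and interface responses are all hypothetical, and a linear discriminant classifier stands in for whatever model the team used.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(42)

# Hypothetical calibration data: feature vectors (e.g., mean and slope of
# fNIRS channels over a time window), labeled by induced workload level
# during a controlled calibration task.
low = rng.normal(0.0, 1.0, size=(100, 8))    # "low workload" windows
high = rng.normal(0.8, 1.0, size=(100, 8))   # "high workload" windows
X = np.vstack([low, high])
y = np.array([0] * 100 + [1] * 100)

clf = LinearDiscriminantAnalysis().fit(X, y)

def adapt_interface(window_features):
    """Map the classifier's workload estimate to an interface adaptation."""
    workload = clf.predict(window_features.reshape(1, -1))[0]
    return "increase automation" if workload == 1 else "offer more detail"

# At runtime, each new feature window drives one adaptation decision.
print(adapt_interface(rng.normal(0.8, 1.0, size=8)))
```

The key property for a passive BCI is that this loop runs continuously in the background; the user never issues an explicit brain "command."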

The project will create and study these new user interfaces in domains where the effect on task performance of adding brain input to the interface can be measured objectively. Such interfaces are most useful in demanding, high-performance, multitasking situations, so carefully calibrated multitasking application scenarios from the team's research in human-robot interaction will be employed.

The project will also advance the range of fNIRS brain measurements that can be applied to user interfaces. It will study a recently identified fNIRS signal obtained from the phase relationships among different regions of the scalp at low frequencies (around 0.1 Hz), as well as a wider range of sensor placement locations than previously examined. As these are developed into usable real-time measurements through machine learning and other analysis approaches, they will be incorporated into new user interfaces.
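One common way to quantify the kind of low-frequency phase relationship described above is a phase-locking value between pairs of channels. The sketch below is illustrative only — the band edges, sampling rate, and synthetic signals are assumptions, not the project's actual analysis:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def phase_locking_value(x, y, fs, band=(0.06, 0.14)):
    """Phase-locking value between two channel time series.

    Band-pass both signals around the low-frequency band of interest
    (~0.1 Hz), extract instantaneous phase via the Hilbert transform,
    and average the phase-difference vectors on the unit circle.
    Returns a value in [0, 1]; 1 means perfectly locked phases.
    """
    b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Synthetic check: two 0.1 Hz oscillations with a fixed phase offset
# should be strongly locked; a signal against pure noise should not.
fs = 10.0                      # a plausible fNIRS sampling rate, Hz
t = np.arange(0, 300, 1 / fs)  # five minutes of data
rng = np.random.default_rng(0)
sig_a = np.sin(2 * np.pi * 0.1 * t) + 0.3 * rng.standard_normal(t.size)
sig_b = np.sin(2 * np.pi * 0.1 * t + 0.8) + 0.3 * rng.standard_normal(t.size)
noise = rng.standard_normal(t.size)

print(phase_locking_value(sig_a, sig_b, fs))  # close to 1
print(phase_locking_value(sig_a, noise, fs))  # much lower
```

For real-time use, the same computation would run over a sliding window, trading statistical stability (longer windows) against responsiveness (shorter ones).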

Broader Impacts: The target of the research is adaptive interfaces for non-disabled users, where brain measurement serves as an additional source of user input. However, as the work proceeds toward making this a more robust technology, project outcomes will also hold promise for physically challenged users, ultimately promising to improve the lives of people with severe motor disabilities.

Project Report

This project has helped to bring brain-computer interfaces into mainstream human-computer interaction on several fronts. It has shown the way toward the more widespread use of passive or "implicit" brain-computer interfaces as a means of augmenting interaction for non-disabled users performing conventional computer tasks. Previously, brain-computer interfaces had been used primarily for disabled users or in fairly exotic experimental tasks. We have shown how they can be used in realistic scenarios, both highly skilled and relatively common ones.

We have conducted experiments on these interfaces and published the results. In each case, we have shown not just that our system functions correctly, classifies brain signals correctly, or is preferred by the user, but also that it yields an objectively measurable improvement in performance on a realistic task. We have produced such results in a variety of areas, including UAV (remotely piloted aircraft) control, movie preference and recommendation, cursor movement (bubble cursor), information filtering, human-robot interaction, and musical improvisation. We concluded the project by publishing a more comprehensive paper in ACM Transactions on Computer-Human Interaction, which ties together our work in this area and introduces a theoretical framework for it.

We also began the development of an inexpensive and portable wireless optical brain measurement device, which will continue independently beyond the term of this project. This promises to make the type of interfaces we are developing cheaper and more widely available in the long term. Finally, the project has brought a range of students, from undergraduates to Ph.D. candidates, into the lab and exposed them to the process of research in human-computer interaction. Several of these Ph.D. students are now faculty members at other universities.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Application #: 1065154
Program Officer: Ephraim Glinert
Budget Start: 2011-06-01
Budget End: 2015-05-31
Fiscal Year: 2010
Total Cost: $935,524
Name: Tufts University
City: Boston
State: MA
Country: United States
Zip Code: 02111