The ability to interact remotely over the Internet is redefining the nature of collaboration. Many collaborative activities require coordinating attention and action with a partner moment by moment; without the benefit of physical co-presence, these collaborations are difficult to conduct efficiently. This project explores using human eye gaze to create partner models for mediating time-critical collaborative activities. A partner model is a dynamically learned description of what a partner is trying to do: for example, what someone may be looking for, or what they consider relevant within a task.
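
Concretely, one might represent such a partner model as a probability distribution over candidate goals that is revised as behavioral evidence arrives. The Python sketch below is an illustrative assumption, not the project's design; the class name, fields, and update rule are all hypothetical.

```python
# Minimal sketch (an assumption, not the proposal's implementation):
# a partner model as an evolving probability distribution over the
# partner's possible goals, revised as behavioral evidence arrives.
from dataclasses import dataclass, field

@dataclass
class PartnerModel:
    """Belief about what the partner is trying to do."""
    belief: dict = field(default_factory=dict)  # goal -> probability

    def observe(self, likelihood: dict) -> None:
        """Fold in new evidence: P(goal) is proportional to
        P(evidence | goal) * P(goal), then renormalize."""
        for goal in self.belief:
            self.belief[goal] *= likelihood.get(goal, 1e-9)
        z = sum(self.belief.values())
        self.belief = {g: p / z for g, p in self.belief.items()}

    def best_guess(self) -> str:
        return max(self.belief, key=self.belief.get)

# Example: evidence favoring one goal shifts the belief toward it.
model = PartnerModel({"find red car": 0.5, "monitor crowd": 0.5})
model.observe({"find red car": 0.9, "monitor crowd": 0.2})
print(model.best_guess())  # -> "find red car"
```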

Intellectual Merit: Many tasks and events are implicit or poorly defined, requiring that partner models be learned from evidence that unfolds in a person's ongoing behavior. Eye trackers will be used to determine which task-relevant objects a person chooses to look at (and not look at); by analyzing these gaze patterns together with the properties of the fixated objects, human and computer partners will learn a model of what the person is attempting to do. Tasks to be explored include searching for a novel or ambiguous moving target specified only by an incomplete semantic description, and monitoring a complex dynamic environment for unusual events defined by atypical target movements and by relationships between people and objects. The findings will advance the fields of human-computer interaction, psycholinguistics, artificial intelligence, object and event detection by humans and computers, and multimodal human communication.
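
As one illustration of how gaze evidence might drive such inference, the sketch below performs a Bayesian update over hypothesized search targets after each fixation. The object features, the softmax relevance model, and the temperature parameter tau are illustrative assumptions, not the project's actual method.

```python
# Hypothetical sketch: inferring a searcher's target from fixations.
# Feature scores and the softmax relevance model are illustrative
# assumptions, not the project's actual method.
import math

# Candidate hypotheses about what the viewer is searching for, each
# scoring how well an object's features match the hypothesized target.
HYPOTHESES = {
    "red_vehicle": lambda obj: obj["red"] * obj["vehicle"],
    "person_bag":  lambda obj: obj["person"] * obj["bag"],
}

def fixation_likelihood(obj, hypothesis, objects, tau=0.2):
    """P(fixate obj | hypothesis): softmax over every on-screen
    object's relevance to the hypothesized target description."""
    score = HYPOTHESES[hypothesis]
    den = sum(math.exp(score(o) / tau) for o in objects)
    return math.exp(score(obj) / tau) / den

def update_partner_model(prior, fixated_obj, objects):
    """Bayesian update of the belief over hypotheses after one fixation."""
    posterior = {h: p * fixation_likelihood(fixated_obj, h, objects)
                 for h, p in prior.items()}
    z = sum(posterior.values())
    return {h: p / z for h, p in posterior.items()}

# Example: three objects on screen, and the viewer fixates the red car.
objects = [
    {"name": "red car",    "red": 1.0, "vehicle": 1.0, "person": 0.0, "bag": 0.0},
    {"name": "pedestrian", "red": 0.0, "vehicle": 0.0, "person": 1.0, "bag": 0.3},
    {"name": "blue truck", "red": 0.0, "vehicle": 1.0, "person": 0.0, "bag": 0.0},
]
belief = {"red_vehicle": 0.5, "person_bag": 0.5}  # uniform prior
belief = update_partner_model(belief, objects[0], objects)
print(belief)  # belief shifts strongly toward "red_vehicle"
```

Objects the viewer pointedly does not fixate carry evidence under the same softmax model: a hypothesis that assigns them high relevance loses probability when they go unvisited.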

Broader Impacts: The results of the project will facilitate the development of new tools that help people with their tasks by, for example, finding and highlighting objects in a scene that match the viewer's goals and helping the viewer track moving targets. The results will also lead to new tools for remote collaboration, with the goal of making coordination at a distance as efficient as face-to-face interaction. The tools and techniques from the project are expected to benefit a variety of applications, including assistive technologies for people with communication impairments and improved security screening procedures. The project will provide training and research experiences for Stony Brook University's racially, ethnically, and economically diverse students, including women and others underrepresented in science and engineering.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Application #: 1111047
Program Officer: William Bainbridge
Budget Start: 2011-08-01
Budget End: 2015-07-31
Fiscal Year: 2011
Total Cost: $749,999
Name: State University of New York, Stony Brook
City: Stony Brook
State: NY
Country: United States
Zip Code: 11794