In this project, the PI will develop the science needed to enhance mobile augmented reality applications with (a) eye tracking and (b) gaze direction. Mobile augmented reality uses a hand-held device to provide a see-through view of the physical world, superimposing information about the things in that view onto an image of the world. It is as if you held a piece of glass up to the world and text appeared on that glass, labeling the things you see through it. The project will investigate how mobile eye tracking, which monitors where a person is looking while on the go, can determine which objects in a visual scene a person is interested in, and thus might like to have annotated in their augmented reality view. It will also investigate how to make these scene annotations appear and disappear in a manner that is neither distracting nor obtrusive, through experiments that measure a person's ability to accomplish visual tasks while presented with text annotations of different sizes, transparency levels, and distances from the point-of-gaze (the point where the person is looking). Finally, the project will develop algorithms that automatically manage the density and placement of these labels to best support human tasks while avoiding the creation of distracting "visual clutter".
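As a rough illustration only (not the project's actual algorithm), the following Python sketch shows one way gaze-aware label management could work. The Label structure, the interest scores, and the protect_radius and max_visible thresholds are all hypothetical placeholders for quantities the proposed experiments would determine:

```python
import math
from dataclasses import dataclass

@dataclass
class Label:
    text: str
    x: float          # screen position of the annotated object (pixels)
    y: float
    interest: float   # estimated user interest, e.g. from gaze dwell time

def manage_labels(labels, gaze_x, gaze_y,
                  protect_radius=80.0,   # clutter-free zone around the point-of-gaze
                  max_visible=5):        # cap on overall label density
    """Return the labels to display, most interesting first.

    Hypothetical policy: suppress labels that would crowd the
    point-of-gaze, then show only the highest-interest labels so the
    view never becomes cluttered.
    """
    def dist(lbl):
        return math.hypot(lbl.x - gaze_x, lbl.y - gaze_y)

    # Drop labels inside the protected region around the gaze point.
    candidates = [l for l in labels if dist(l) > protect_radius]
    # Rank the rest by estimated interest and cap the count.
    candidates.sort(key=lambda l: l.interest, reverse=True)
    return candidates[:max_visible]

# Example: three annotated objects, with the gaze near the first one.
scene = [Label("statue", 100, 120, 0.9),
         Label("painting", 400, 300, 0.7),
         Label("doorway", 620, 450, 0.2)]
print([l.text for l in manage_labels(scene, gaze_x=110, gaze_y=130)])
# -> ['painting', 'doorway']  (the "statue" label is suppressed at the gaze point)
```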

The project will also develop the science needed to direct a person's gaze in the physical world by means of new visualization techniques for mobile augmented reality systems. In this case, it is as if the piece of glass through which you view the world changed slightly from time to time, unobtrusively encouraging you to look more closely at specific, task-relevant parts of the scene. This aspect of the project will be conducted in collaboration with the Houston Museum of Fine Arts.
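Again as a hypothetical sketch, in the spirit of published subtle gaze direction techniques rather than the project's own method, a cue like a slight luminance boost could be applied to a task-relevant target and faded out as the viewer's gaze approaches it, so the change itself is never inspected directly. The function name and the max_strength and falloff parameters below are assumptions:

```python
import math

def modulation_strength(gaze_x, gaze_y, target_x, target_y,
                        max_strength=0.15,   # subtle brightness boost (fraction)
                        falloff=200.0):      # pixels over which the cue fades out
    """Strength of a brief luminance cue at a task-relevant target.

    The cue is strongest while the viewer looks elsewhere and fades to
    zero as the gaze converges on the target.
    """
    d = math.hypot(target_x - gaze_x, target_y - gaze_y)
    return max_strength * min(1.0, d / falloff)

# Looking far from the target -> full (but still subtle) cue.
print(modulation_strength(50, 50, 600, 400))    # ~0.15
# Gaze converging on the target -> cue fades out.
print(modulation_strength(580, 390, 600, 400))  # ~0.017
```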

Broader Impacts: The project will advance the discovery of visualization techniques that allow mobile applications to enhance viewing of the physical world, while promoting the teaching and learning of science in multiple contexts. In collaboration with the Houston Museum of Fine Arts, the project will develop a freely downloadable iPhone application that enhances museum-goers' learning in the arts and provides a proof of concept for how the techniques developed in this project could be used in other contexts.

The project also pursues several specific new opportunities in science education, including the development of (a) project-related curriculum for a science-based summer camp for junior high school students at Texas A&M University, (b) new university-level courses on programming for augmented reality and on the human performance aspects of lighting and cinematography, and (c) conference tutorials on experimental design, eye tracking, and perception in computer graphics.

Agency: National Science Foundation (NSF)
Division: Division of Information and Intelligent Systems (IIS)
Application #: 1253432
Program Officer: Ephraim Glinert
Budget Start: 2013-02-01
Budget End: 2020-01-31
Fiscal Year: 2012
Total Cost: $537,862
Institution: Texas A&M University
City: College Station
State: TX
Country: United States
Zip Code: 77845