An integrated eye-tracking head-mounted display (ET-HMD) able to display stereoscopic virtual images while also tracking the direction of the user's gaze would benefit both fundamental scientific research and a host of emerging applications. Yet despite significant advances in, and the commercial availability of, stand-alone HMD and eye-tracking technologies, a portable, lightweight, accurate, and robust system that conforms to the form factor of a pair of sunglasses remains elusive. The PI's goal in this project is to develop a fundamentally new optical technology that will make it possible to realize this dream by overcoming the limitations of the conventional methods that have been applied to ET-HMD designs to date. In most previous work on designing ET-HMD systems, the optical systems for the HMD and eye-tracking paths were treated separately, and rotationally symmetric optical surfaces were used. In contrast, the PI's approach is to utilize freeform optical technology in combination with an innovative optical scheme that uniquely combines the display optics with the eye imaging optics. To these ends, she will investigate the challenges of designing the freeform ET-HMD system and develop appropriate design methods along with an optimization strategy for the required high-performance optical system. She will implement a fully integrated, portable prototype system, and develop calibration and assessment methods for evaluating both the display and eye-tracking performance. As a testbed application for the new technology, the PI will evaluate the feasibility of adopting ET-HMDs to augment communication by patients suffering from ALS (amyotrophic lateral sclerosis) or a similar neurological disease that causes loss of speech. Project outcomes will include the first lightweight, portable ET-HMD prototype with a form factor close to that of sunglasses.

Broader Impacts: The new technology resulting from this project will have critical impacts in many fields. It will create a revolutionary platform for mobile communication, wearable computing, and portable entertainment. It will provide a new tool for research related to vision and human factors, where eye movements provide an objective metric for understanding human perception in 3D space and effectiveness at specific tasks. It will afford augmentative and alternative human-computer interfaces for people with proprioceptive disabilities or with situational impairments due to having their hands and feet occupied. And it will undoubtedly spur development of a host of new and exciting applications. The research will also provide a vehicle for training the next generation of interdisciplinary scientists and engineers, at both the undergraduate and graduate levels.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 1115489
Program Officer: Ephraim Glinert
Project Start:
Project End:
Budget Start: 2011-08-15
Budget End: 2015-07-31
Support Year:
Fiscal Year: 2011
Total Cost: $499,954
Indirect Cost:
Name: University of Arizona
Department:
Type:
DUNS #:
City: Tucson
State: AZ
Country: United States
Zip Code: 85719