The holy grail for a 3D display is to produce a scene that, to our eyes, is indistinguishable from reality. Despite significant advances in display technology, simultaneous deception of all perceptual depth cues is still beyond the reach of most displays. As a consequence, existing 3D displays, especially those used in Augmented / Virtual Reality (AR/VR) applications, tend to cause discomfort due to subtle differences between the real and the virtual world. This project aims to develop novel 3D photorealistic display designs that ensure that the light field incident on the eye mimics reality at a significantly higher level of fidelity than existing designs. The development of such a technology stands to impact a broad range of disciplines and applications beyond AR/VR systems, such as ophthalmology and the testing of optical systems. Further, as the visual realism of displayed content improves, such displays stand to replace the standard 2D monitors that are ubiquitous in work and home environments today. These photorealistic displays will also have the potential to make AR/VR systems more accessible to visually impaired people. The education and outreach components of this project disseminate display research and demos in middle and high schools in the greater Pittsburgh region via lab visits and hands-on workshops.

To achieve the goal of photorealistic AR/VR displays, this project explores different approaches to modeling the virtual world and focuses on the development of three distinct approaches that progressively increase realism at the cost of complexity: (i) simple models in the form of a depth and texture decomposition; (ii) light field representations that provide a dense sampling of the ray space; and (iii) coherent wavefronts modeled using Fourier optics. All three approaches aim to produce light fields that are dense in spatial and angular resolution, even though the underlying physical mechanisms rely on different models of light transport. The project will develop techniques for generating content that is easily adapted to the requirements of the display. The project also characterizes the fundamental properties of each approach, including the achievable spatial and angular resolutions, the precision in depicting occlusion, the size of the eye-box, and their limitations.
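The third approach, modeling coherent wavefronts with Fourier optics, can be illustrated by a minimal sketch of angular-spectrum propagation, a standard Fourier-optics technique for advancing a sampled complex field through free space. The function name, grid parameters, and aperture example below are illustrative assumptions for exposition, not details taken from the project:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a sampled coherent wavefront a distance z (meters)
    using the angular spectrum method: FFT to spatial frequencies,
    multiply by the free-space transfer function, inverse FFT back."""
    n, m = field.shape
    fx = np.fft.fftfreq(m, d=dx)          # spatial frequencies along x
    fy = np.fft.fftfreq(n, d=dx)          # spatial frequencies along y
    FX, FY = np.meshgrid(fx, fy)
    # Free-space transfer function; evanescent components (arg < 0) are dropped.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.where(
        arg >= 0,
        np.exp(2j * np.pi / wavelength * z * np.sqrt(np.maximum(arg, 0.0))),
        0.0,
    )
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Example: plane wave through a 100 µm circular aperture, propagated 5 mm.
N, dx, wl = 256, 2e-6, 633e-9             # grid size, 2 µm pixels, HeNe wavelength
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2] * dx
aperture = ((x**2 + y**2) < (100e-6) ** 2).astype(complex)
out = angular_spectrum_propagate(aperture, wl, dx, 5e-3)
intensity = np.abs(out) ** 2              # diffraction pattern at z = 5 mm
```

Because the transfer function has unit magnitude for all propagating frequencies, the total intensity is conserved, which is a quick sanity check on any such implementation.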

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Agency
National Science Foundation (NSF)
Institute
Division of Information and Intelligent Systems (IIS)
Type
Standard Grant (Standard)
Application #
2008464
Program Officer
Balakrishnan Prabhakaran
Project Start
Project End
Budget Start
2020-08-01
Budget End
2023-07-31
Support Year
Fiscal Year
2020
Total Cost
$500,000
Indirect Cost
Name
Carnegie-Mellon University
Department
Type
DUNS #
City
Pittsburgh
State
PA
Country
United States
Zip Code
15213