Augmented reality (AR) allows a viewer to visualize digital, virtual objects mixed into their real-world environment. Unlike virtual reality, which displays virtual content but blocks out the view of the real world, AR enables mixtures such as a virtual representation of a remote person sitting in a real conference room with other people. Other examples of AR applications include: (i) science students visualizing the invisible flow of electricity moving through the wires of their circuits, or visualizing unseen forces such as magnetism or atmospheric currents in their real-world environment; (ii) doctors overlaying an X-ray-like view of internal organs during exams or laparoscopic surgery. This project aims to answer a complex and important question: how does the human visual system perceive the mix of virtual AR content and the real world? The researchers will build a model of the perception of the visual characteristics of AR systems, including color, brightness, depth, and graphics quality, using data from experiments that test visual responses. The model will mimic visual adaptation to different lighting environments, such as indoors versus outdoors, as well as compensation for reflections and other interactions with the environment. The project will result in improved AR system designs, including responsive algorithms that react to changes in the environment and anticipate the visual perception of the user. It will also result in a better scientific understanding of the human visual system, which has the potential to improve many other visual interfaces in addition to AR. Integrating the project with education, the researchers will develop and test AR learning modules for university courses in science, math, and art, leading to improved learning. By improving the understanding of visual perception in AR, this project will help enable visually accurate, more comfortable, and more responsive AR display systems. These systems have the potential to enhance the visual sense in applications such as education, medicine, and transportation.
This project aims to build a robust computational model of visual appearance in AR systems that takes into account visual adaptation, cognitive interpretations, and the optical interaction between virtually displayed content and the real objects and illumination in the environment. Visual experiments will employ psychophysical scaling, color matching via adjustment, and constant-stimuli tasks to understand the influence of the luminance, color, contrast, complexity, and depth of both the AR virtual foreground and the real-world background. A model of color and material appearance in transparent AR environments will build on the working hypothesis that the perceived color is a non-physical addition of foreground and background whose weightings depend on cognitive discounting of the layers. Additional experiments will measure visual luminance and chromatic adaptation in temporally changing AR viewing environments, for instance by asking observers to adjust displayed AR and real-world stimuli to appear achromatic in different lighting situations and at different adaptation times. The physical lighting environment, including the location, intensity, and color of light sources and bright objects, will be sensed with cameras and color sensors and used as input to responsive display algorithms that utilize the developed model of visual adaptation and appearance in AR. These responsive algorithms will ensure robust, predictable color appearance and display behavior. During the project, several AR education applications incorporating the research results will be developed for validation and testing; classroom assessments by students and faculty will evaluate these applications and help validate the research results. AR will also be adopted in the research group’s graduate color science courses as a tool for demonstrating adaptation and surround effects and as an environment for practicing psychophysical methods with experiments related to the ongoing research. The researchers will share their computational model and experimental findings through publications and professional organizations, with the goal of meaningful implementation in both AR system design and AR applications. Further, the impact of AR and the fascinating topic of color science will be shared with the community through the University’s open houses and the recruiting of students and faculty.
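As a concrete illustration of the working hypothesis on layered appearance (an assumed formulation for exposition, not a result or commitment of the project), the color perceived at a point seen through an optical see-through AR display could be written as

    \hat{C} \;=\; w_f(d)\,C_f \;+\; w_b(d)\,C_b ,

where C_f and C_b denote colorimetric coordinates of the virtual foreground and the real background, and the weights w_f and w_b depend on a cognitive discounting factor d rather than on the physical transmittance of the combiner optics. All symbols here are introduced only for illustration.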
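The abstract does not specify the responsive display algorithm, but one standard building block such a sensor-driven pipeline could use is a von Kries-style chromatic adaptation transform based on the illuminant estimated by the cameras and color sensors. The sketch below is a minimal, hypothetical Python illustration under that assumption; the function names, white points, and direction of adaptation are illustrative choices, not the project's method.

    # Illustrative sketch only: a von Kries / Bradford chromatic adaptation
    # step that a responsive AR display pipeline could apply, driven by an
    # illuminant estimate from environment sensors. Names are hypothetical.
    import numpy as np

    # Bradford matrix mapping CIE XYZ to a cone-like response space.
    BRADFORD = np.array([
        [ 0.8951,  0.2664, -0.1614],
        [-0.7502,  1.7135,  0.0367],
        [ 0.0389, -0.0685,  1.0296],
    ])

    def adapt_xyz(xyz, source_white_xyz, dest_white_xyz):
        """Map an XYZ stimulus from a source white point to the corresponding
        color under a destination white point (von Kries scaling in a
        Bradford cone-like space)."""
        lms = BRADFORD @ np.asarray(xyz, dtype=float)
        lms_src = BRADFORD @ np.asarray(source_white_xyz, dtype=float)
        lms_dst = BRADFORD @ np.asarray(dest_white_xyz, dtype=float)
        # Scale each cone-like channel by the destination/source white ratio.
        lms_adapted = lms * (lms_dst / lms_src)
        return np.linalg.inv(BRADFORD) @ lms_adapted

    if __name__ == "__main__":
        # Example: the sensors report a warm (incandescent-like) room
        # illuminant; re-express a stimulus relative to a D65 display white.
        sensed_white = np.array([109.85, 100.0, 35.58])   # CIE XYZ, illuminant A
        display_white = np.array([95.047, 100.0, 108.883])  # CIE XYZ, D65
        patch = np.array([41.24, 42.0, 44.0])             # arbitrary stimulus
        print(adapt_xyz(patch, sensed_white, display_white))

In an actual responsive system, a step of this kind would be only one stage among the appearance and adaptation corrections the project's model is intended to supply.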
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.