An augmented reality (AR) system overlays virtual objects on physical spaces, viewed through AR glasses or through the camera and screen of a mobile device. This enables rich interactive experiences for education, remote guidance for workforce training, immersive manuals for do-it-yourself home construction, construction-industry monitoring, interior design previews for consumer purchases, and entertaining games, to name just a few areas. However, current AR systems suffer from high energy consumption and limited performance because visual computing at high image resolutions and high frame rates demands high data rates. The proposed project aims to reduce the sensing data rate of visual computing, enabling more compact augmented reality devices with smaller batteries and higher-precision placement of virtual objects in physical spaces.

To this end, the project redesigns visual computing hardware and software systems around processing and producing image pixel regions with varying spatial resolutions and temporal intervals, selectively guided by the needs of the augmented reality software. The project will introduce new methods to design and characterize adaptive sensing architectures, including hardware architecture and software framework patterns that give applications control over sensor operation. In addition, the project advances K-12, undergraduate, and graduate student engagement in science, technology, engineering, and math through the development and demonstration of an educational augmented reality experience that describes image sensor system operation in the context of computer vision and augmented reality applications.
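The idea of application-guided regions with varying spatial resolution and temporal interval can be sketched in a few lines. This is a minimal illustrative model, not the project's actual interface: the `Region` fields, the stride/interval semantics, and the data-rate accounting are all assumptions made for the sketch.

```python
# Illustrative sketch of region-adaptive sensing: each region of interest
# is read out at its own spatial stride and temporal interval, and pixels
# outside the requested regions are never read at all.
from dataclasses import dataclass

@dataclass
class Region:
    x: int
    y: int
    w: int
    h: int
    spatial_stride: int    # keep every Nth pixel in each dimension
    temporal_interval: int  # refresh this region every Nth frame

def pixels_read(regions, frame_index):
    """Pixels the sensor must read out for one frame."""
    total = 0
    for r in regions:
        if frame_index % r.temporal_interval == 0:
            total += (r.w // r.spatial_stride) * (r.h // r.spatial_stride)
    return total

def average_data_rate(regions, width, height, frames=120):
    """Fraction of the full-resolution, full-rate pixel stream actually read."""
    full = width * height * frames
    read = sum(pixels_read(regions, f) for f in range(frames))
    return read / full

# Example: a dense foreground region sampled every frame, plus a coarse
# background region refreshed every 4th frame, on a 1920x1080 sensor.
regions = [
    Region(800, 400, 320, 320, spatial_stride=1, temporal_interval=1),
    Region(0, 0, 1920, 1080, spatial_stride=4, temporal_interval=4),
]
print(f"{average_data_rate(regions, 1920, 1080):.3f}")  # well under 10% of the full stream
```

Even this toy configuration reads only a small fraction of the full pixel stream, which is the kind of sensing data-rate reduction the project targets; the real system would let the AR application choose the regions and rates at run time.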

The project research produces system software, applications, and experimental testbenches, including source code and input data (images and videos) to test the system. The project provides detailed instructions for reproducing all steps of the experiments, along with experimental results describing power consumption, performance, task accuracy, and other characterization data. These artifacts are hosted on a software repository site and linked through an accessible webpage: http://meteor.ame.asu.edu/rhythmicpixelregions. The repository and webpage will remain active for 5 years after the end of the project or 5 years after publication, whichever is later.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Agency
National Science Foundation (NSF)
Institute
Division of Computer and Network Systems (CNS)
Application #
1942844
Program Officer
Erik Brunvand
Budget Start
2020-04-01
Budget End
2025-03-31
Fiscal Year
2019
Total Cost
$166,442
Name
Arizona State University
City
Tempe
State
AZ
Country
United States
Zip Code
85281