Mobile devices such as smartphones are increasingly used for watching videos, since they can be used for this purpose conveniently anywhere and at any time, such as while commuting on a subway or train, sitting in a waiting room, or lounging at home. Because of the large data size and intensive computation involved, video processing requires frequent memory accesses that consume a large amount of power, limiting battery life and frustrating mobile users. On the one hand, memory designers focus on hardware-level power-optimization techniques without considering how hardware performance influences viewers' actual experience. On the other hand, the human visual system is limited in its ability to detect subtle degradations in image quality; for example, under high ambient illumination, such as outdoors in direct sunlight, the veiling luminance (i.e., glare) on the screen of a mobile device can effectively mask imperfections in the image, so that under these circumstances a video can be rendered at lower than full quality without the viewer being able to detect any difference. This isolation between hardware design and viewer experience leads to overly pessimistic design margins and therefore significant hardware implementation overhead. This project integrates viewer awareness and hardware adaptation to achieve power optimization without degrading video quality as perceived by users. The results of this project will impact basic research on both hardware design and human vision, and will provide critical viewer-awareness data from human subjects that can be used to engineer better video rendering for increased battery life on mobile devices. The project will directly involve undergraduate and graduate students, including female and Native American students, in interdisciplinary research.

Developing a viewer-aware mobile video-memory solution has proven to be a very challenging problem because of (i) the complexity of existing viewer-experience models; (ii) memory modules that lack runtime adaptation; and (iii) the difficulty of viewer-experience analysis for hardware designers. This project addresses the problem by (i) focusing on the most influential viewing-context factor impacting viewer experience, namely ambient luminance; (ii) proposing novel methodologies for adaptive hardware design; and (iii) integrating a unique combination of expertise from the investigators, ranging from psychology to integrated circuit design and embedded systems. Specifically, this project will (i) experimentally and mathematically connect viewer experience, ambient illuminance, and memory performance; (ii) develop energy-quality adaptive hardware that adjusts memory usage based on ambient luminance so as to reduce power consumption without impacting viewer experience; and (iii) design a mobile video system to fully evaluate the effectiveness of the developed methodologies.
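To make the energy-quality adaptation concrete, the following minimal C sketch illustrates one way such a policy could look: an ambient-light reading (in lux) is mapped to a coarse video-memory quality level. The sensor stub read_ambient_lux, the knob set_memory_quality, the quality levels, and the lux thresholds are all hypothetical placeholders; the actual viewer-experience thresholds and hardware mechanisms are precisely what the project proposes to determine experimentally.

/*
 * Illustrative sketch only: maps an ambient-light reading to a video
 * memory "quality level". The sensor read and the memory-controller
 * knob are hypothetical stubs, not APIs from the project or any
 * real platform.
 */
#include <stdio.h>

/* Hypothetical stub: return ambient illuminance in lux. On a real
 * device this would come from the phone's ambient-light sensor. */
static double read_ambient_lux(void)
{
    return 12000.0; /* e.g., a bright overcast day outdoors */
}

/* Coarse quality levels; the thresholds below are made-up placeholders,
 * standing in for the viewer-experience data the project would collect. */
typedef enum { QUALITY_FULL, QUALITY_MEDIUM, QUALITY_LOW } quality_t;

static quality_t choose_quality(double lux)
{
    if (lux < 500.0)
        return QUALITY_FULL;   /* dim room: degradations are visible */
    else if (lux < 10000.0)
        return QUALITY_MEDIUM; /* bright indoor lighting */
    else
        return QUALITY_LOW;    /* direct sunlight: glare masks artifacts */
}

/* Hypothetical stub: apply the chosen level, e.g. by relaxing DRAM
 * refresh or truncating low-order pixel bits in the frame buffer. */
static void set_memory_quality(quality_t q)
{
    printf("memory quality level set to %d\n", (int)q);
}

int main(void)
{
    double lux = read_ambient_lux();
    set_memory_quality(choose_quality(lux));
    return 0;
}

The sketch only shows the shape of the adaptation loop (sense ambient light, pick a quality level, reconfigure the memory subsystem); it does not represent the project's actual models or hardware design.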

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Budget Start: 2018-10-01
Budget End: 2018-11-30
Fiscal Year: 2018
Total Cost: $300,000
Name: North Dakota State University Fargo
City: Fargo
State: ND
Country: United States
Zip Code: 58108