Virtual Reality (VR) and Augmented Reality (AR) applications are projected to be the next wave of "killer apps" in the future Internet. VR/AR applications deliver vivid, immersive experiences and create tremendous new opportunities in many domains, including education, business, healthcare, and entertainment. Many VR/AR applications involve streaming of 360-degree video scenes. Compared with traditional video streaming, 360-degree video streaming requires much higher network bandwidth and much lower packet delivery latency, and users' quality of experience is highly sensitive to dynamics in both the network environment and user viewing behavior. Addressing these unique challenges, this project will develop novel 360-degree video coding and delivery solutions to enable high-quality interactive, on-demand, and live video streaming.

The project includes several research thrusts to enable novel joint coding-and-delivery solutions for high-quality, robust 360-degree video streaming. For interactive streaming, a novel Field-of-View (FoV)-adaptive coding structure will be designed to achieve low encoding and decoding latency. Real-time joint optimization of streaming rate adaptation and video coding bit allocation based on the predicted FoV will be studied to maximize the rendered video quality. For on-demand streaming, a two-tier video coding and delivery framework will be developed, and the rate allocation and video chunk scheduling between the two tiers will be investigated to strike the desired balance between rendered video quality and streaming robustness. To facilitate predictive coding and delivery, the project will develop effective algorithms for predicting user FoVs, based on a user's past FoV trajectory and the video's audio-visual content, using deep learning architectures. Personalized FoV prediction based on other users' view trajectories will also be explored under the framework of recommender systems. Fully functional 360-degree video streaming prototypes will be developed and tested in controlled and real network environments to validate and improve the new designs. If successful, the research will lead to new theory and designs for 360-degree video coding and help enable the widespread deployment of high-quality, robust 360-degree video streaming systems. The research findings will be made available through publications, talks, open protocols, and open-source code, allowing a multitude of developers, researchers, and companies to advance 360-degree video streaming. The project will also create valuable research opportunities for graduate and undergraduate students, especially women and minority students. Interactions with industry will be facilitated through workshops and several research centers at New York University.
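To make the two-tier idea concrete, the minimal Python sketch below splits a chunk's bit budget between a full-sphere base tier (always delivered, for robustness) and enhancement-tier tiles inside the predicted FoV (for quality). It is an illustration only: the function names, the even per-tile split, and the rate ladder are assumptions for this sketch, not the project's actual joint optimization.

```python
# Hypothetical sketch of two-tier rate allocation for 360-degree streaming.
# A low-rate base tier always covers the full sphere; the remaining budget
# buys enhancement-tier rates for tiles in the predicted FoV.

def allocate_two_tier(bandwidth_bps, base_rate_bps, fov_tiles, tile_rates_bps):
    """Split the per-chunk bit budget between base and enhancement tiers.

    bandwidth_bps  : predicted throughput for the next chunk
    base_rate_bps  : fixed rate of the full-sphere base tier
    fov_tiles      : tile indices predicted to fall in the user's FoV
    tile_rates_bps : candidate enhancement rates per tile (ascending)
    """
    budget = bandwidth_bps - base_rate_bps   # base tier is always sent
    if budget <= 0 or not fov_tiles:
        return {}                            # fall back to base tier only
    per_tile = budget / len(fov_tiles)       # naive even split across FoV tiles
    chosen = {}
    for tile in fov_tiles:
        # pick the highest enhancement rate that fits this tile's share
        feasible = [r for r in tile_rates_bps if r <= per_tile]
        if feasible:
            chosen[tile] = max(feasible)
    return chosen

# Example: 20 Mbps link, 5 Mbps base tier, 6 predicted FoV tiles
print(allocate_two_tier(20e6, 5e6, [3, 4, 5, 10, 11, 12], [1e6, 2e6, 4e6]))
```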
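The FoV predictors the project proposes are based on deep learning and recommender systems; as a simple point of reference only, the sketch below shows a naive least-squares extrapolation of the recent yaw/pitch head trajectory. All names, the sampling rate, and the prediction horizon are assumptions made for illustration.

```python
# Hypothetical baseline for short-horizon FoV prediction: linear extrapolation
# of recent head-orientation samples. Not the project's deep-learning method.
import numpy as np

def predict_fov(timestamps, yaw_deg, pitch_deg, horizon_s):
    """Extrapolate head orientation horizon_s seconds past the last sample."""
    t = np.asarray(timestamps, dtype=float)
    A = np.vstack([t, np.ones_like(t)]).T  # fit angle = a*t + b per axis
    a_y, b_y = np.linalg.lstsq(A, np.asarray(yaw_deg, float), rcond=None)[0]
    a_p, b_p = np.linalg.lstsq(A, np.asarray(pitch_deg, float), rcond=None)[0]
    t_pred = t[-1] + horizon_s
    yaw = float((a_y * t_pred + b_y + 180.0) % 360.0 - 180.0)  # wrap to [-180, 180)
    pitch = float(np.clip(a_p * t_pred + b_p, -90.0, 90.0))    # clamp to valid pitch
    return yaw, pitch

# Example: half a second of 10 Hz head-tracking samples, predict 1 s ahead
ts = [0.0, 0.1, 0.2, 0.3, 0.4]
print(predict_fov(ts, [0, 2, 4, 6, 8], [0, -1, -2, -3, -4], 1.0))
```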

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Agency: National Science Foundation (NSF)
Institute: Division of Computer and Network Systems (CNS)
Type: Standard Grant
Application #: 1816500
Program Officer: Darleen Fisher
Budget Start: 2018-10-01
Budget End: 2021-09-30
Fiscal Year: 2018
Total Cost: $499,901
Name: New York University
City: New York
State: NY
Country: United States
Zip Code: 10012