Seeing around a corner without using a mirror seems like an impossible feat, and an ordinary camera does not seem to help. In this project, the research team will develop systems that process images and videos from ordinary cameras to construct around-the-corner views and to infer other scene features hidden from direct view, such as the number of people present. The team will also augment the use of ordinary cameras with depth sensors such as those used in self-driving cars and immersive gaming. Being able to see around corners has the potential to aid first responders and improve navigation safety.

The fundamental difficulty in using diffusely reflected light for imaging is that light from many directions is combined at the visible surface. Methods for non-line-of-sight imaging have predominantly depended on separating light paths by their lengths using expensive, high-resolution time-of-flight (TOF) measurements. Instead of depending entirely on TOF, this project emphasizes exploiting opaque objects that restrict and vary how light is combined at a diffuse surface. The research team will also develop a framework for expressing inverse-problem difficulty that goes beyond condition numbers to provide localized and directional notions of resolution. Analyses will inspire and be informed by proof-of-concept experiments.
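To make the role of occluders concrete, the following is a minimal sketch, not the project's actual method, of passive non-line-of-sight imaging posed as a linear inverse problem. It assumes a hypothetical 1D geometry: hidden sources on a line parallel to a diffuse wall, an opaque segment partway between them, and noisy intensity measurements along the wall; all positions, sizes, and noise levels are illustrative. It compares the condition number of the light-transport matrix with and without the occluder and reconstructs the hidden scene with Tikhonov-regularized least squares.

```python
# Toy occluder-aided passive NLOS imaging sketch (illustrative geometry only).
import numpy as np

rng = np.random.default_rng(0)

N = 40                    # hidden-scene pixels (unknowns)
M = 60                    # measurement points on the visible diffuse wall
D = 2.0                   # distance (m) from hidden scene to wall
occ_y = 1.0               # occluder plane's distance from the wall
occ_span = (-0.3, 0.1)    # x-extent of the opaque segment

src_x = np.linspace(-1.0, 1.0, N)    # hidden source positions
wall_x = np.linspace(-1.0, 1.0, M)   # observation positions on the wall

def transport_matrix(use_occluder):
    """Light-transport matrix A[i, j]: contribution of hidden source j to the
    intensity at wall point i (inverse-square falloff, ray optionally blocked
    by the opaque segment)."""
    A = np.zeros((M, N))
    for i, w in enumerate(wall_x):
        for j, s in enumerate(src_x):
            weight = 1.0 / ((w - s) ** 2 + D ** 2)
            if use_occluder:
                # x-coordinate where the source-to-wall ray crosses the occluder plane
                x_cross = (occ_y / D) * s + (1.0 - occ_y / D) * w
                if occ_span[0] <= x_cross <= occ_span[1]:
                    weight = 0.0   # ray blocked by the occluder
            A[i, j] = weight
    return A

A_open = transport_matrix(use_occluder=False)
A_occ = transport_matrix(use_occluder=True)
print("condition number, no occluder:   %.2e" % np.linalg.cond(A_open))
print("condition number, with occluder: %.2e" % np.linalg.cond(A_occ))

# Simulate a hidden scene and noisy wall measurements, then reconstruct with
# Tikhonov-regularized least squares: x_hat = (A^T A + lam*I)^{-1} A^T y.
x_true = np.zeros(N)
x_true[10:14] = 1.0
x_true[28:30] = 0.6
y = A_occ @ x_true + 1e-4 * rng.standard_normal(M)

lam = 1e-6
x_hat = np.linalg.solve(A_occ.T @ A_occ + lam * np.eye(N), A_occ.T @ y)
print("relative reconstruction error: %.3f"
      % (np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)))
```

In this toy setup the occluder introduces sharp, position-dependent variations in the transport matrix, which typically lowers its condition number relative to the unobstructed case and makes the reconstruction less sensitive to noise; a single condition number, however, says nothing about where or in which directions resolution is gained, which is the gap the project's localized and directional analysis aims to address.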

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Agency: National Science Foundation (NSF)
Institute: Division of Computing and Communication Foundations (CCF)
Application #: 1955864
Program Officer: Scott Acton
Project Start:
Project End:
Budget Start: 2020-07-01
Budget End: 2024-06-30
Support Year:
Fiscal Year: 2019
Total Cost: $148,588
Indirect Cost:
Name: Massachusetts Institute of Technology
Department:
Type:
DUNS #:
City: Cambridge
State: MA
Country: United States
Zip Code: 02139