Today’s cameras provide a digital window into the real world with broad applications across societal and scientific areas. Despite their remarkably diverse applications, existing cameras are engineered as general-purpose sensing and signal-processing pipelines. This project breaks with this conventional approach and proposes methods to “computationally evolve” the cameras of tomorrow. As such, the results will drastically expand our understanding of how to develop and optimize entire cameras and signal-processing chains for a specific application domain, including medical imaging, robotics, scientific imaging, virtual/augmented reality, and self-driving vehicles. We will develop a completely new breed of cameras for these diverse application domains, for example, ones that may consider the scene as part of the camera. The research efforts are tightly integrated with an outreach program that introduces underrepresented and at-risk students in the New Jersey and New York area to science and technology through domain-specific cameras for self-driving vehicles.

The research of this project will develop a novel comprehensive learning framework that allows the researchers to reason over a distribution of cameras in a continuous and differentiable fashion. This framework will hinge on a theory that models the illumination, scene light transport, acquisition, and processing stages of a broad space of cameras. As such, the research team will be able to optimize over the architecture and parameters of full sensing and processing stacks, resulting in fundamentally novel cameras tailored to specific imaging and perception tasks. These new cameras learn to shift complexity between optics, compute, illumination, sensing, and scene light transport, exploiting the scene and scene semantics as part of the imaging process. This enables unprecedented capabilities for domain-specific imaging in scattering media, ultra-miniaturized learned cameras, neural optical compute, and imaging at ultra-large scales in adverse conditions as well as at ultra-small scales, all of which will be explored in this project.
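The abstract does not describe an implementation; the following is a minimal, hypothetical sketch (assuming a PyTorch-style setup, which is our assumption rather than the project's actual framework) of what end-to-end differentiable optimization over a camera's optics, sensing, and processing stages might look like: a learnable point spread function, a simple noisy sensor model, and a small reconstruction network trained jointly against a downstream task loss.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DifferentiableCamera(nn.Module):
    def __init__(self, kernel_size=11):
        super().__init__()
        # Optics stage: learnable point spread function, normalized to sum to 1.
        self.psf_logits = nn.Parameter(torch.zeros(1, 1, kernel_size, kernel_size))
        # Acquisition stage: learnable sensor gain.
        self.log_gain = nn.Parameter(torch.zeros(1))
        # Processing stage: small CNN acting as the image signal processor.
        self.isp = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),
        )

    def forward(self, scene):
        # Optics: convolve scene radiance with the learned PSF.
        psf = torch.softmax(self.psf_logits.flatten(), dim=0).view_as(self.psf_logits)
        blurred = F.conv2d(scene, psf, padding=psf.shape[-1] // 2)
        # Sensor: apply gain and additive read noise (kept differentiable).
        measurement = torch.exp(self.log_gain) * blurred
        measurement = measurement + 0.01 * torch.randn_like(measurement)
        # Processing: reconstruct the scene from the raw measurement.
        return self.isp(measurement)

# Joint gradient-based optimization of optics, sensor, and processing parameters.
camera = DifferentiableCamera()
optimizer = torch.optim.Adam(camera.parameters(), lr=1e-3)
for step in range(100):
    scene = torch.rand(8, 1, 64, 64)      # stand-in for training scenes
    recon = camera(scene)
    loss = F.mse_loss(recon, scene)       # task loss (here: image reconstruction)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

Because every stage is differentiable, the task loss can shift complexity between the optical, sensing, and processing parameters during training, which is the design principle the framework described above generalizes to full camera architectures.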

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Application #: 2047359
Program Officer: Jie Yang
Project Start:
Project End:
Budget Start: 2021-02-01
Budget End: 2026-01-31
Support Year:
Fiscal Year: 2020
Total Cost: $100,000
Indirect Cost:
Name: Princeton University
Department:
Type:
DUNS #:
City: Princeton
State: NJ
Country: United States
Zip Code: 08544