Deep convolutional neural networks are a class of mathematical models underlying a variety of machine learning tools that have achieved impressive success, often obtaining state-of-the-art results across different fields. Yet, a theoretical understanding of these algorithms and of the fundamental ideas behind them has remained elusive. Such an understanding is essential for recognizing and characterizing their limitations, for providing guarantees for their performance, and even for developing and engineering improved practical models. A promising approach to obtaining this understanding is to make assumptions about the class of samples on which these models are deployed (e.g., that these are "simple enough") with the intention of providing theoretical insights about them. In particular, assuming that the data can be represented as sparse combinations of basic building blocks across several convolutional layers leads to a multilayer convolutional sparse model. Furthering the understanding of this model is what this project seeks to accomplish, broadening the understanding of its related optimization and learning problems and shedding light on deep learning methodologies.
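
For concreteness, one common formulation of such a model in the sparse representation literature expresses a signal through a cascade of sparse decompositions; the symbols below (dictionaries D_i, representations gamma_i, sparsity levels s_i) are illustrative notation and are not drawn from the award text.

```latex
% A signal x admits a cascade of sparse decompositions: each layer's
% representation is itself sparsely represented by the next dictionary.
\[
  \mathbf{x} = \mathbf{D}_1 \boldsymbol{\gamma}_1, \qquad
  \boldsymbol{\gamma}_1 = \mathbf{D}_2 \boldsymbol{\gamma}_2, \qquad \ldots, \qquad
  \boldsymbol{\gamma}_{L-1} = \mathbf{D}_L \boldsymbol{\gamma}_L,
  \qquad \text{with } \|\boldsymbol{\gamma}_i\|_0 \le s_i \ \text{ for } i = 1, \dots, L,
\]
% where each D_i is a (convolutional) dictionary and each gamma_i is sparse;
% composing the layers gives x = D_1 D_2 ... D_L gamma_L.
```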

This project proposes to advance the state of the art in generalized sparse models with varying numbers of layers, focusing on both inference and learning problems. Provable and efficient optimization methods will be derived for the inverse problems associated with multilayer sparse models by relying on new results in proximal gradient and subgradient descent methods (a minimal illustration of such a scheme is sketched below). The proposal will further extend the formulation of the pursuit problem to other settings, increasing its stability and robustness to the choice of parameters and to outliers. Furthermore, efficient algorithms for the corresponding unsupervised learning problem will be proposed and analyzed. Questions of sample complexity and generalization bounds will in turn be studied in supervised learning settings. Throughout the project, the resulting algorithms will be studied in terms of their relation to specific convolutional network architectures. The project brings together expertise in signal processing, dictionary learning, machine learning, and the design, analysis, and implementation of optimization methods for large-scale problems.
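
As an illustration of the proximal gradient methods mentioned above, the following minimal Python sketch runs ISTA (iterative soft-thresholding) on the effective dictionary of a two-layer sparse model. All names (`ista_multilayer`, `D1`, `D2`, `lam`) and the synthetic data are hypothetical; this is a sketch of the general technique, not the project's actual algorithms.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the l1 norm: shrink each entry toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista_multilayer(x, D1, D2, lam=0.1, n_iter=200):
    """ISTA for a two-layer sparse pursuit:
        min_gamma 0.5 * ||x - D1 @ D2 @ gamma||^2 + lam * ||gamma||_1
    where the effective dictionary is the product D1 @ D2.
    """
    D_eff = D1 @ D2
    # Step size from the Lipschitz constant of the data-term gradient
    # (squared spectral norm of the effective dictionary).
    step = 1.0 / np.linalg.norm(D_eff, 2) ** 2
    gamma = np.zeros(D_eff.shape[1])
    for _ in range(n_iter):
        grad = D_eff.T @ (D_eff @ gamma - x)  # gradient of the quadratic term
        gamma = soft_threshold(gamma - step * grad, lam * step)
    return gamma

# Toy usage: random dictionaries and a synthetic sparse signal.
rng = np.random.default_rng(0)
D1 = rng.standard_normal((64, 128)) / np.sqrt(64)
D2 = rng.standard_normal((128, 256)) / np.sqrt(128)
gamma_true = np.zeros(256)
gamma_true[rng.choice(256, size=5, replace=False)] = rng.standard_normal(5)
x = D1 @ D2 @ gamma_true
gamma_hat = ista_multilayer(x, D1, D2, lam=0.05)
print("nonzeros recovered:", np.count_nonzero(np.abs(gamma_hat) > 1e-3))
```

Each iteration alternates a gradient step on the quadratic data term with a soft-thresholding step, which is what connects such pursuit schemes to the layered thresholding operations found in convolutional network architectures.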

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Budget Start: 2020-07-01
Budget End: 2023-06-30
Fiscal Year: 2020
Total Cost: $281,251
Name: Johns Hopkins University
City: Baltimore
State: MD
Country: United States
Zip Code: 21218