While computers have revolutionized our world, the resources required to perform computational tasks are poorly understood. Developing a mathematical theory of computation is crucial in our information age, where computers are involved in essentially every part of our lives. Computational complexity, the study of the amount of resources needed to perform computational tasks, is essential for understanding the power of computation and for the development of a theory of computation. It is also essential for designing efficient communication protocols and secure cryptographic protocols, and for understanding human and machine learning. Proving lower bounds on the resources required by different computational models, and for different computational tasks, is among the most exciting, challenging, and important topics in theoretical computer science. The project will study computational complexity lower bounds, focusing on two research directions: lower bounds related to learning, and the relative power of quantum and classical computational models.

In more detail, the project will focus on the following directions. (1) Memory-samples lower bounds for learning: memory/samples lower bounds for online learning algorithms are an exciting research direction that has been studied in a line of recent work. For example, it was proved that any algorithm for learning parities requires either a memory of quadratic size or an exponential number of samples. The project will further study memory/samples lower bounds for learning and their relations to other topics in computational complexity. (2) The relative power of quantum and classical computational models: the project will study gaps between quantum and classical complexity in various models. In particular, it will investigate separations between quantum and classical communication complexity, as well as the relative power of quantum and classical algorithms in the context of memory/samples trade-offs for learning. (3) Circuit complexity lower bounds related to learning: the remarkable success of machine learning, and in particular deep learning, motivates the study of closely related computational models. The project will study linear-threshold circuits and other circuit models related to learning.
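To make the parity-learning setting concrete, here is a minimal Python sketch (illustrative only; the function names and parameters are not from the award). The learner observes a stream of samples (a, ⟨a, x⟩ mod 2) for a secret x ∈ {0,1}ⁿ and recovers x by Gaussian elimination over GF(2). This learner stores all the equations it sees, i.e. uses on the order of n² bits of memory with only O(n) samples; the cited lower bound says that a learner with substantially less memory would instead need exponentially many samples.

```python
import random

def sample(x, rng):
    """One streamed parity sample: a uniform vector a and the label <a, x> mod 2."""
    a = [rng.randrange(2) for _ in range(len(x))]
    return a, sum(ai * xi for ai, xi in zip(a, x)) % 2

def learn_parity(samples, n):
    """Recover the secret x by Gaussian elimination over GF(2).

    Stores every n-bit equation, so memory is quadratic in n while the
    sample count is linear -- the trade-off the lower bound concerns.
    Returns None if the collected equations are not yet full rank.
    """
    rows = [a + [b] for a, b in samples]  # augmented matrix over GF(2)
    r = 0
    for col in range(n):
        piv = next((i for i in range(r, len(rows)) if rows[i][col]), None)
        if piv is None:
            return None  # underdetermined: draw more samples
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][col]:
                rows[i] = [u ^ v for u, v in zip(rows[i], rows[r])]
        r += 1
    # the first n rows are now the identity; the augmented column is x
    return [rows[i][n] for i in range(n)]

rng = random.Random(0)
secret = [rng.randrange(2) for _ in range(16)]
stream = [sample(secret, rng) for _ in range(48)]  # O(n) samples suffice w.h.p.
recovered = learn_parity(stream, 16)
```

With far fewer than n equations the system is rank-deficient and the eliminator reports failure, which is the easy direction of the trade-off; the hard direction, proved in the work the abstract cites, is that no low-memory algorithm can do much better.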

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Project Start:
Project End:
Budget Start: 2020-07-01
Budget End: 2023-06-30
Support Year:
Fiscal Year: 2020
Total Cost: $450,000
Indirect Cost:
Name: Princeton University
Department:
Type:
DUNS #:
City: Princeton
State: NJ
Country: United States
Zip Code: 08544