Computational complexity theory is a mathematical field that studies the limits of computers and the resources needed to perform computational tasks. A mathematical theory of computation is crucial in our information age, where computers are involved in essentially every part of our lives. Computational complexity is also essential to designing efficient communication protocols and secure cryptographic protocols, and to understanding human and machine learning. Studying the limits of computational models is among the most exciting and challenging topics in theoretical computer science, and it is essential for understanding the power of computation and for developing a theory of computation.

The project will study lower bounds for the resources required by different computational models, as well as related topics in computational complexity. The project will focus on three main research directions:

Time-space lower bounds for learning: In a sequence of recent works, the PI and his coauthors proved that some of the most extensively studied learning problems require either a super-linear memory size or a super-polynomial number of samples. The project will further study memory/sample lower bounds for learning and their relations to other topics in complexity theory. Lower bounds for learning under memory constraints demonstrate the importance of memory in learning and cognitive processes. They may be relevant to understanding human learning and may have an impact on machine learning, artificial intelligence, and optimization. They also have applications in cryptography.
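For concreteness (the abstract does not name a specific problem, so this example is an assumption), the canonical problem in this line of work is parity learning: a hidden vector x in {0,1}^n, where each sample is a uniformly random vector a together with the label ⟨a, x⟩ mod 2. A learner with roughly n² bits of memory can store n linearly independent equations and recover x by Gaussian elimination over GF(2) from O(n) samples; the lower bounds referenced above show that with substantially smaller memory, exponentially many samples are required. A minimal sketch of the memory-rich strategy:

```python
import random

def parity_sample(x, n):
    """One sample: a uniformly random vector a and the label <a, x> mod 2."""
    a = [random.randint(0, 1) for _ in range(n)]
    label = sum(ai * xi for ai, xi in zip(a, x)) % 2
    return a, label

def learn_parity(samples, n):
    """Recover x by Gaussian elimination over GF(2).  This learner stores up
    to n equations of n + 1 bits each (~n^2 bits of memory); the lower bounds
    say this memory cannot be shrunk much without an exponential blow-up in
    the number of samples."""
    pivots = {}  # pivot column -> row [a_1 .. a_n | label] with leading 1 there
    for a, label in samples:
        row = a + [label]
        for col in range(n):
            if row[col]:
                if col in pivots:
                    row = [r ^ p for r, p in zip(row, pivots[col])]
                else:
                    pivots[col] = row
                    break
        if len(pivots) == n:
            break
    if len(pivots) < n:
        return None  # not enough linearly independent samples yet
    # Back-substitution, from the last pivot column upward.
    x = [0] * n
    for col in range(n - 1, -1, -1):
        row = pivots[col]
        x[col] = (row[n] + sum(row[j] * x[j] for j in range(col + 1, n))) % 2
    return x
```

The point of the sketch is the memory accounting in `learn_parity`: storing the equations is what costs about n² bits, and that is precisely the resource the lower bounds show cannot be substantially reduced.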

Lower bounds for arithmetic circuits: The project will study lower bounds for arithmetic circuits and formulas, as well as for subclasses of arithmetic circuits and formulas. Lower bounds for arithmetic circuits may have a broader impact within theoretical computer science, because of the centrality of polynomials in the field.
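As a hypothetical illustration of the resource being bounded (not an example taken from the project): an arithmetic circuit computes a polynomial using addition and multiplication gates, and circuit size can be exponentially smaller than the number of monomials in the polynomial. For instance, (x + y)^(2^k) has 2^k + 1 monomials when expanded, yet a circuit with k + 1 gates computes it:

```python
def power_by_squaring_circuit(x, y, k):
    """Evaluate (x + y)^(2^k) with a circuit of k + 1 gates: one addition
    followed by k squarings.  The expanded polynomial has 2^k + 1 monomials,
    so circuit size can be exponentially smaller than monomial count -- the
    kind of gap that circuit lower bounds quantify."""
    g = x + y              # gate 0: addition
    for _ in range(k):     # gates 1..k: repeated squaring
        g = g * g
    return g
```

For example, `power_by_squaring_circuit(1, 2, 3)` evaluates (1 + 2)^8 with only 4 gates, where the expanded polynomial has 9 terms.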

Lower bounds for communication complexity: In a sequence of recent works, the PI and his coauthors proved the first gaps between information complexity and communication complexity. These results show that interactive communication protocols cannot, in general, be compressed to their information content, and hence that interactive analogs of Shannon's source coding theorem and of Huffman coding do not exist. Separations between communication complexity and information complexity may be relevant to electrical engineering, and in particular to the design of efficient communication protocols. The project will further study these topics and, more generally, lower bounds for communication complexity and their relations to other topics in computational complexity.
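For context on the one-way baseline these results contrast with (a standard textbook construction, not part of the project itself): Huffman coding compresses a single message drawn from a known distribution to within one bit of its entropy, which is exactly the kind of guarantee shown above not to carry over to interactive protocols. A minimal sketch:

```python
import heapq

def huffman_code(freqs):
    """Build a prefix-free code from {symbol: probability}.  By Huffman's
    theorem its expected codeword length is within 1 bit of the source
    entropy -- the one-shot, one-way compression guarantee that, per the
    separation results above, has no interactive analog."""
    # Heap entries: (weight, tiebreaker, {symbol: codeword-so-far}).
    # The integer tiebreaker keeps comparisons away from the dicts.
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)   # two lightest subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]
```

For example, `huffman_code({'a': 0.5, 'b': 0.25, 'c': 0.25})` assigns a 1-bit codeword to `a` and 2-bit codewords to `b` and `c`, matching the entropy of 1.5 bits exactly.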

Project Start:
Project End:
Budget Start: 2017-09-01
Budget End: 2021-08-31
Support Year:
Fiscal Year: 2017
Total Cost: $450,000
Indirect Cost:
Name: Princeton University
Department:
Type:
DUNS #:
City: Princeton
State: NJ
Country: United States
Zip Code: 08544