In his 1948 paper that founded information theory, Shannon introduced a notion of "entropy" to measure the amount of "randomness" in a process. However, to a computer with bounded resources, the amount of randomness can appear to be very different from the Shannon entropy. Indeed, various measures of "computational entropy" have proved very useful in computational complexity and the foundations of cryptography. Recent work by the PI and others has introduced new measures of computational entropy, increased our understanding of both the new and old computational entropy measures, and found greater applicability of these measures in cryptography and complexity theory.
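For concreteness, Shannon entropy measures the average number of bits needed to describe a sample from a distribution. A minimal illustration (the function name and examples are ours, not from the project):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i = 0 contribute nothing (0 * log 0 is taken as 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally random: 1 bit of entropy.
fair = shannon_entropy([0.5, 0.5])
# A biased coin is more predictable, hence lower entropy.
biased = shannon_entropy([0.9, 0.1])
```

A fair coin gives exactly 1.0 bit, while the 90/10 biased coin gives roughly 0.47 bits. Computational entropy measures, by contrast, ask how random a distribution *appears* to a resource-bounded observer, which can be much higher or lower than this information-theoretic quantity.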

This project aims to refine our understanding of computational entropy, use computational entropy to seek unified and optimal constructions of cryptographic primitives, bring us closer to resolving the power of randomness in space-bounded computation, and identify new applications of computational entropy in the theory of computation.

This research is closely integrated with the PI's educational efforts. The PI continues to develop courses and online lecture notes related to the project (on topics such as Pseudorandomness, Cryptography, and Applied Algebra). Graduate students are involved in all aspects of the research, and undergraduates participate regularly as well. The PI is also very active in service to the scientific community, including outreach efforts such as "vision nuggets" that convey research directions in theoretical computer science to a broad audience.

Agency
National Science Foundation (NSF)
Institute
Division of Computing and Communication Foundations (CCF)
Type
Standard Grant (Standard)
Application #
1116616
Program Officer
Jack S. Snoeyink
Project Start
Project End
Budget Start
2011-07-01
Budget End
2015-06-30
Support Year
Fiscal Year
2011
Total Cost
$450,000
Indirect Cost
Name
Harvard University
Department
Type
DUNS #
City
Cambridge
State
MA
Country
United States
Zip Code
02138