Information theory provides the mathematical underpinnings of how to optimally communicate, compress, and process information. The underlying algorithms, including codes for communication, have historically been invented through human ingenuity and proven optimal by deep mathematical reasoning. The rise of deep learning, a branch of machine learning, presents an opportunity to revisit this paradigm for discovering new algorithms. This project studies how to use deep learning to accelerate algorithm discovery for long-standing information-theoretic problems. More broadly, the project will promote a stronger interface between deep learning and information theory, benefiting both research communities. Its outcomes will also help build a theory that explains the gains deep-learning algorithms deliver in many application domains, addressing an important gap in our scientific understanding of machine learning.

This project studies three problems at the interface of deep learning and information theory. (1) Deep-learning-based code design: this thrust studies the design of codes that combat noise in the communication medium. It involves first replicating previous successes of code design within the new paradigm, and then inventing novel codes for unsolved problems, which requires novel network architectures with potential applications beyond coding. (2) Statistical property testing with deep learning: this thrust studies how deep learning can help with information estimation (such as mutual-information estimation) and statistical property testing (such as independence testing). Solutions to high-dimensional property testing will require novel p-value guarantees, which will be explored. (3) Information-theoretic underpinnings of deep learning: this thrust studies the fundamental information-theoretic principles underlying deep learning, such as the sample complexity and optimal training algorithms for recurrent neural networks.
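To make the mutual-information estimation in thrust (2) concrete, the sketch below evaluates the Donsker-Varadhan lower bound that neural estimators of this kind optimize, on synthetic correlated Gaussian data. It is illustrative only: the one-parameter critic T(x, y) = a·x·y and the grid search stand in for the trained neural network a deep-learning estimator would use, and none of these choices come from the project itself.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8
n = 100_000

# Correlated Gaussian pair; the true mutual information is
# -0.5 * log(1 - rho^2) ≈ 0.511 nats.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)
y_shuffled = rng.permutation(y)  # samples from the product of marginals

def dv_bound(a):
    """Donsker-Varadhan lower bound on MI for the critic T(x, y) = a*x*y:
    E_joint[T] - log E_marginals[exp(T)]."""
    joint_term = np.mean(a * x * y)
    marginal_term = np.log(np.mean(np.exp(a * x * y_shuffled)))
    return joint_term - marginal_term

# Maximize the bound over the single critic parameter by grid search;
# a neural estimator would instead train a flexible critic by gradient ascent,
# tightening the bound toward the true mutual information.
best = max(dv_bound(a) for a in np.linspace(0.1, 0.9, 40))
true_mi = -0.5 * np.log(1 - rho**2)
print(f"DV lower bound: {best:.3f} nats (true MI: {true_mi:.3f})")
```

Because the quadratic critic family is too restrictive, the bound stays visibly below the true value; replacing it with a trained network is exactly what makes this a deep-learning problem.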

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

University of Washington
United States