This research involves understanding the computational complexity of fundamental machine learning problems. In particular, the investigators study which classic learning problems are unlikely to admit efficient solutions. In terms of broader impact, this line of research aids practitioners and algorithm designers by outlining fundamental obstacles to building powerful learning systems. For example, can we reduce difficult open problems from cryptography (e.g., factoring) and complexity theory (e.g., NP-complete languages) to certain problems in machine learning? Such reductions would provide strong evidence that those machine learning problems are intractable. Another avenue of research is to prove unconditional lower bounds on the resources required to infer functions in restricted learning models.
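To make the reduction template concrete, the following is a schematic version of the classical cryptographic-hardness argument in the style of Kearns and Valiant; it is included purely as an illustration of the method, not as a specific result of this project. If a concept class C contains a pseudorandom function family whose security rests on factoring, then an efficient PAC learner for C would yield a distinguisher for the family, contradicting the assumed hardness:

\[
\{f_k\}\subseteq \mathcal{C}\ \text{pseudorandom}\ \wedge\ \mathcal{C}\ \text{PAC-learnable in poly time}
\ \Longrightarrow\ \exists\,h:\ \Pr_x\big[h(x)=f_k(x)\big]\ge \tfrac12+\varepsilon
\ \Longrightarrow\ \text{factoring}\in\mathsf{BPP}.
\]

Here the learner's hypothesis h agrees with f_k noticeably better than chance, which suffices to distinguish f_k from a truly random function.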

The intellectual merit of the research lies in finding new reductions between problems in cryptography and complexity theory (in particular, communication complexity) and problems from learning theory. For example, the PI studies the impact of lattice-based cryptography on machine learning and examines its implications for the DNF learning problem. Additionally, the PI researches the use of communication complexity to rule out learning simple concept classes via small sets of arbitrary features. The PI further delineates the role of Fourier analysis in proving lower bounds in the well-known Statistical Query learning model. Finally, this research relates NP-completeness and circuit complexity to general questions about proper learning and distribution-specific query learning.
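As a standard illustration of the Fourier-analytic method in the Statistical Query model (a textbook example due to Blum et al., not one of the new results proposed here), consider the class of parity functions. Parities are pairwise orthogonal under the uniform distribution:

\[
\chi_S(x)=(-1)^{\sum_{i\in S}x_i},\qquad
\mathbb{E}_{x\sim\{0,1\}^n}\big[\chi_S(x)\,\chi_T(x)\big]=
\begin{cases}
1, & S=T,\\
0, & S\neq T.
\end{cases}
\]

Since the $2^n$ parities are pairwise uncorrelated, the class has statistical-query dimension $2^n$, and any SQ learner with inverse-polynomial tolerance must make $2^{\Omega(n)}$ queries.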

Budget Start: 2007-09-01
Budget End: 2012-02-29
Fiscal Year: 2007
Total Cost: $250,000
Name: University of Texas Austin
City: Austin
State: TX
Country: United States
Zip Code: 78712