Neural networks have received much publicity in the scientific and popular press as a new method of computation claimed to be superior to standard techniques. Early successes on small problem instances have generated optimistic conjectures that neural network solutions to computationally difficult tasks will scale to real-world problems. This project focuses on developing and applying techniques from computational complexity theory in order to bring more rigor to the study of how neural networks scale to large problems. In addition, the research aims to quantify both their computational abilities and their capacity to learn from experience. One objective is the formal analysis of neural networks whose performance has hitherto only been estimated from experimental evidence on small or randomly generated problems. Specific subjects to be investigated include: (1) upper and lower bounds on the weights of linear threshold functions; (2) the complexity of popular Hopfield network algorithms; (3) the complexity of the loading problem for analog neural networks with a fixed number of nodes; and (4) mistake bounds for variants of the perceptron learning algorithm.
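For readers unfamiliar with item (4), the following is a minimal sketch of the classic perceptron learning algorithm, the algorithm whose mistake-bound variants the project proposes to analyze. The function name and toy data are illustrative, not taken from the project itself; the mistake counter is the quantity that mistake bounds (such as Novikoff's) control.

```python
def perceptron_train(samples, epochs=100):
    """Classic perceptron learning rule.

    samples: list of (x, y) pairs with x a tuple of floats and y in {-1, +1}.
    Returns the learned weights, bias, and the total number of mistakes made.
    """
    n = len(samples[0][0])
    w = [0.0] * n          # weight vector of the linear threshold function
    b = 0.0                # bias (threshold)
    mistakes = 0
    for _ in range(epochs):
        for x, y in samples:
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:          # mistake: apply the update rule
                mistakes += 1
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
    return w, b, mistakes

# Linearly separable toy data: label is +1 exactly when x1 + x2 > 1.
data = [((0.0, 0.0), -1), ((1.0, 1.0), 1), ((0.0, 1.0), -1), ((1.5, 0.5), 1)]
w, b, mistakes = perceptron_train(data)
```

On linearly separable data such as this, the perceptron convergence theorem guarantees that the number of mistakes is bounded by (R/γ)², where R bounds the input norms and γ is the separation margin, independent of how many examples are presented.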

Agency: National Science Foundation (NSF)
Institute: Division of Computer and Communication Foundations (CCF)
Application #: 9302917
Program Officer: Yechezkel Zalcstein
Project Start:
Project End:
Budget Start: 1993-09-01
Budget End: 1998-08-31
Support Year:
Fiscal Year: 1993
Total Cost: $102,617
Indirect Cost:
Name: University of North Texas
Department:
Type:
DUNS #:
City: Denton
State: TX
Country: United States
Zip Code: 76203