Proposal: DMS 9505199
PIs: Lee Jones and Yuly Makovoz
Institution: University of Massachusetts at Lowell
Title: APPROXIMATION, ESTIMATION, AND COMPUTATION PROPERTIES OF NEURAL NETWORKS AND RELATED PARSIMONIOUS MODELS

Abstract: Artificial neural networks and related parsimonious models for function approximation and estimation have attracted recent attention in science and engineering. Work by the authors has uncovered several interesting aspects of these methods. Approximation bounds have been obtained by methods taken from the probability theory of empirical processes, including bounds on the average squared error and the maximal error of neural network and related approximations. These approximation bounds reveal a rate of convergence that is insensitive to the dimension of the input space for certain nonparametric (infinite-dimensional) classes of functions, specified via the closure of convex hulls of finite-dimensional families of functions. As a consequence, accurate statistical estimation of functions in these nonparametric classes is possible without recourse to exponentially large sample sizes. Unfortunately, computation of neural net estimates can be an extremely difficult task. The investigators study how the problems of accurate approximation, estimation, and computation are intertwined. In this research they investigate fundamental mathematical, statistical, and computational limits of the capacity to approximate and to estimate these functions accurately by computationally feasible algorithms.

Empirical modeling techniques used in a variety of scientific and engineering tasks deal with the problem of how to combine a large number of observable quantities to best predict or approximate a response variable. The input-response relation may be described by a rather complicated function, and it may be desirable to approximate it by a combination of a small number of elementary, comparatively simpler functions.
These models differ from classical techniques in approximation and statistical estimation in that the functions being combined are not fixed in advance, but rather selected and adjusted according to what is known or observed about the intended response variable so as to provide the best fit. The investigators are quantifying the mathematical and statistical advantages of these adjustable selections. Artificial neural networks and related techniques are at the heart of modern models for adaptive and high-performance computation. The investigators study the limits of what is computationally feasible with these models. The ubiquitous need for accurate prediction and empirical modeling, both for the scientific method in general and for nationally strategic topics in particular, motivates this research.
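The dimension-insensitive convergence rate mentioned in the abstract can be sketched in the well-known Maurey-Jones-Barron form from the published literature (a sketch of the general phenomenon, not necessarily the precise bounds obtained in this project):

```latex
% If $f$ lies in the closure of the convex hull of a set $G$ of
% functions in a Hilbert space, with $\|g\| \le b$ for all $g \in G$,
% then for every $n$ there is an $n$-term convex combination
% $f_n = \sum_{i=1}^{n} c_i g_i$, $c_i \ge 0$, $\sum_i c_i = 1$,
% with squared approximation error
\[
  \|f - f_n\|^2 \;\le\; \frac{b^2 - \|f\|^2}{n}.
\]
```

The resulting rate $O(1/\sqrt{n})$ in the number $n$ of combined elementary functions does not deteriorate with the dimension of the input space, which is the source of the statistical advantage described above.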

Agency
National Science Foundation (NSF)
Institute
Division of Mathematical Sciences (DMS)
Type
Standard Grant (Standard)
Application #
9505199
Program Officer
Joseph M. Rosenblatt
Project Start
Project End
Budget Start
1995-07-01
Budget End
1999-06-30
Support Year
Fiscal Year
1995
Total Cost
$120,000
Indirect Cost
Name
University of Massachusetts Lowell
Department
Type
DUNS #
City
Lowell
State
MA
Country
United States
Zip Code
01854