9616391 Olurotimi

Recurrent neural networks (RNNs) are nonlinear dynamical filters, and J. Lo has shown that they are capable of converging to the minimum variance filter for a signal process. However, unlike conventional filtering techniques, which use top-down, parametric design to realize the filters, the RNN approach is a data-driven synthesis approach. The neural network approach is justified by several universal approximation theorems, which ensure that the neural network form is theoretically sufficient for implementing these tasks. However, one feature of neural network design familiar to designers and users alike is that many different networks (differing, for example, in their weights) can be constructed to solve the same problem. This project will employ novel concepts and quantitative results on the behavior of RNNs in noise to address this problem. Building on the existence results of Lo, and using recent results of Olurotimi and Das, the PI develops a modified training measure. The resulting training scheme, based on Werbos's ordered derivatives, then searches not for just any optimum weight set, but for the restricted class of optimum weight sets that also increase the estimator efficiency. The proposed research will result in a design scheme expected to significantly reduce the design time of nonlinear RNN filters.
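The abstract gives no algorithmic details, but the core idea (training an RNN as a signal filter by ordered derivatives, i.e., backpropagation through time, with a training measure augmented by a term that steers the search toward efficient estimators) can be sketched. The following Python/PyTorch fragment is a minimal, hypothetical illustration only: the scalar signal model, the noise level, the RNNFilter architecture, the hyperparameters, and the residual-variance penalty used as a stand-in for the project's unspecified efficiency criterion are all assumptions, not details from the project description.

import math
import torch
import torch.nn as nn

torch.manual_seed(0)

# Assumed setup: a nonlinear scalar signal observed in additive Gaussian
# noise. The RNN filter sees the noisy observation y_t and is trained to
# recover the clean signal x_t at each time step.
T, batch = 200, 32
t = torch.linspace(0, 8 * math.pi, T)
x = torch.sin(t) * torch.cos(0.5 * t)        # clean signal, shape (T,)
x = x.expand(batch, T).unsqueeze(-1)         # (batch, T, 1)
y = x + 0.3 * torch.randn_like(x)            # noisy observations

class RNNFilter(nn.Module):
    """Hypothetical recurrent filter: obs sequence in, estimate out."""
    def __init__(self, hidden=16):
        super().__init__()
        self.rnn = nn.RNN(input_size=1, hidden_size=hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, obs):
        h, _ = self.rnn(obs)                 # hidden states over time
        return self.out(h)                   # filtered estimate at each step

model = RNNFilter()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
lam = 0.1   # assumed weight on the efficiency-promoting penalty

for epoch in range(200):
    x_hat = model(y)
    residual = x_hat - x
    mse = residual.pow(2).mean()             # standard minimum-variance target
    # Stand-in for the modified training measure: penalize the variance of
    # the residual across noise realizations, so gradient descent prefers
    # optimum weight sets that also behave like efficient estimators. The
    # actual criterion of Olurotimi and Das is not given in the abstract.
    penalty = residual.var(dim=0).mean()
    loss = mse + lam * penalty
    opt.zero_grad()
    loss.backward()                          # ordered derivatives (BPTT)
    opt.step()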

Project Start:
Project End:
Budget Start: 1996-09-01
Budget End: 1998-05-31
Support Year:
Fiscal Year: 1996
Total Cost: $49,568
Indirect Cost:
Name: George Mason University
Department:
Type:
DUNS #:
City: Fairfax
State: VA
Country: United States
Zip Code: 22030