The objectives of this research are (a) to develop effective methods for optimizing the entropy of an efficient neural network model; (b) to study the effects of feedback on the design of an associative memory; and (c) to develop machine architectures for implementing the theoretical models. The model under study expands during adaptation; hence fundamental problems, such as determining a network topology in advance and avoiding the local minima of gradient descent procedures, do not arise. Hybrid architectures that emerge from embedding optical computing structures in conventional dataflow architectures are considered and refined for implementing the proposed neural network model. Effective training methods, and architectures that can be implemented with existing technology, are developed. Neural networks have proven effective at learning representations that are difficult to specify formally. One major drawback of existing learning algorithms is their tendency to stall once a local minimum is reached. The model under study uses feedback links and an expanding configuration, reducing this possibility. Problems related to implementation and architectural design are also addressed in this research. The project thus considers the design of an effective neural network model from both theoretical and experimental points of view.
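The central idea of an expanding model can be illustrated with a minimal sketch. The Python code below is only an illustration, not the project's actual method (which the abstract does not specify): a single-hidden-layer network trained on XOR by gradient descent adds a hidden unit whenever the loss plateaus, so a topology chosen too small is repaired during adaptation rather than fixed in advance. All names (GrowingNet, grow) are hypothetical.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GrowingNet:
    # Illustrative only: a net that expands during adaptation by adding
    # hidden units when gradient descent stalls (a crude plateau test).
    def __init__(self, n_in, n_out, n_hidden=1, lr=0.5):
        self.lr = lr
        self.W1 = rng.normal(scale=0.5, size=(n_in + 1, n_hidden))   # +1 bias input
        self.W2 = rng.normal(scale=0.5, size=(n_hidden + 1, n_out))  # +1 bias unit

    def forward(self, X):
        self.Xb = np.hstack([X, np.ones((len(X), 1))])    # append bias input
        self.h = sigmoid(self.Xb @ self.W1)
        self.hb = np.hstack([self.h, np.ones((len(X), 1))])
        self.y = sigmoid(self.hb @ self.W2)
        return self.y

    def backward(self, t):
        # One step of plain gradient descent on squared error.
        dy = (self.y - t) * self.y * (1 - self.y)
        dh = (dy @ self.W2[:-1].T) * self.h * (1 - self.h)  # skip the bias row
        self.W2 -= self.lr * self.hb.T @ dy
        self.W1 -= self.lr * self.Xb.T @ dh

    def grow(self):
        # Expand: one new input-to-hidden column and one new hidden-to-output
        # row (inserted above the bias row), freshly initialized.
        new_col = rng.normal(scale=0.5, size=(self.W1.shape[0], 1))
        new_row = rng.normal(scale=0.5, size=(1, self.W2.shape[1]))
        self.W1 = np.hstack([self.W1, new_col])
        self.W2 = np.vstack([self.W2[:-1], new_row, self.W2[-1:]])

# XOR cannot be solved with a single hidden unit, so growth must be triggered.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)

net = GrowingNet(n_in=2, n_out=1)
prev_loss = float("inf")
for epoch in range(20001):
    loss = float(np.mean((net.forward(X) - t) ** 2))
    if epoch % 500 == 0:
        if prev_loss - loss < 1e-4 and loss > 0.01:
            net.grow()  # stalled and still wrong: expand instead of stopping
        prev_loss = loss
    net.backward(t)

print(f"final loss {loss:.4f} with {net.h.shape[1]} hidden units")

Growing only when progress stalls mirrors the abstract's claim: the topology is not fixed in advance, and a local minimum of the current configuration becomes a point of expansion rather than a stopping point.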

Budget Start: 1989-06-15
Budget End: 1991-11-30
Fiscal Year: 1989
Total Cost: $74,000
Name: Tulane University
City: New Orleans
State: LA
Country: United States
Zip Code: 70118