Ji 9312504

This research will address an important open problem in supervised learning with feedforward multi-layer neural networks: how to evaluate the generalization performance of the networks using network-parameter-dependent information. The focus is on large networks used for function approximation. The objectives are: 1) to evaluate the generalization error using network-parameter-dependent information; 2) to incorporate the dynamics of training algorithms into the evaluation of the generalization error; and 3) to design network complexity so as to achieve more predictable generalization from a small training set for large networks with a fixed architecture. To achieve these objectives, statistical methods will be used to develop a general relationship between the generalization error and the expected network complexity, and a Bayesian network will be utilized to relate the dynamics of training algorithms to the generalization performance of the networks. The proposed research bears on important open problems in combined supervised and reinforcement learning and in high-speed, high-capacity, large optical neural networks.
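The central quantity above, the generalization error, can be illustrated empirically. The following is a minimal sketch, not taken from the award: it trains a small one-hidden-layer feedforward network by plain gradient descent on a few noisy samples of a target function, then compares training error with error on a held-out set as a proxy for the generalization gap. All sizes, the target function, and the training schedule are assumptions chosen for illustration.

```python
# Illustrative sketch (not from the award): empirical generalization gap
# of a small feedforward network for function approximation.
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Function to approximate; noise is added to the training labels below.
    return np.sin(2 * np.pi * x)

# Small training set (the regime of interest in the abstract) and a
# larger held-out set for estimating the true error.
n_train, n_test, n_hidden = 20, 200, 10
x_tr = rng.uniform(-1, 1, (n_train, 1))
y_tr = target(x_tr) + 0.1 * rng.standard_normal((n_train, 1))
x_te = rng.uniform(-1, 1, (n_test, 1))
y_te = target(x_te)

# One-hidden-layer tanh network; weights are the "network parameters"
# on which the evaluation depends.
W1 = 0.5 * rng.standard_normal((1, n_hidden))
b1 = np.zeros(n_hidden)
W2 = 0.5 * rng.standard_normal((n_hidden, 1))
b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

# Plain gradient descent on mean-squared error (backpropagation).
lr = 0.05
for _ in range(2000):
    h, pred = forward(x_tr)
    err = pred - y_tr
    gW2 = h.T @ err / n_train
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = x_tr.T @ dh / n_train
    gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

train_mse = float(np.mean((forward(x_tr)[1] - y_tr) ** 2))
test_mse = float(np.mean((forward(x_te)[1] - y_te) ** 2))
gap = test_mse - train_mse  # empirical generalization gap
print(train_mse, test_mse, gap)
```

On a small training set the held-out error typically exceeds the training error; the research described above aims to predict this gap from the trained parameters and the training dynamics rather than from a held-out set.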