The research seeks to understand how neural networks alter the information that passes through them. A unique aspect of the neural networks we study is the process of synaptogenesis. That is, in addition to the widely used adaptive process of associative synaptic modification, our studies also investigate the role of a biologically inspired process for creating and destroying connections. Like associative synaptic modification, the making and breaking of connections is driven by the input environment, so that it, too, is an adaptive modification principle. The first problem we solved was to make these two adaptive processes compatible. Our goal now is to create a quantitative theory of how networks governed by these two processes would grow and perform. As an example of performance measures, our studies use a form of relative entropy to quantify the statistical dependence of neural network encodings, and entropies to measure average information loss. In this research, neural network transformations are studied in a variety of statistically defined but random environments. Our strategy is to extrapolate our results via the asymptotics afforded by the laws of large numbers and the central limit theorem. Unfortunately, our networks and simulations to date have been too small to avail ourselves of such asymptotics for many of the variables. It is this consideration that motivates us to enlarge our network simulations.
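For concreteness, the standard information-theoretic quantities behind such measures can be sketched as follows; the abstract does not give the project's exact definitions, so the formulation below is only an illustrative instance, with X standing for an input ensemble and Y for its network encoding.

```latex
% Sketch only; the specific forms of relative entropy and information loss
% used in the project may differ from these textbook definitions.

% Relative entropy (Kullback-Leibler divergence) between distributions P and Q:
\[
  D(P \,\|\, Q) \;=\; \sum_{x} P(x)\,\log \frac{P(x)}{Q(x)}
\]

% Statistical dependence of an encoding Y on an input X, expressed as the
% relative entropy between the joint distribution and the product of the
% marginals (i.e., the mutual information):
\[
  I(X;Y) \;=\; D\!\left(P_{XY} \,\|\, P_X P_Y\right) \;=\; H(X) - H(X \mid Y)
\]

% Average information loss across the transformation, measured with entropies:
\[
  \mathcal{L} \;=\; H(X) - I(X;Y) \;=\; H(X \mid Y)
\]
```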