This project will investigate a novel approach to modeling and designing learning algorithms for the small-sample case using entropy. It will seek a deeper mathematical understanding of the new method and its relation to the mean square error, still the workhorse of learning today. The PI will investigate the properties of the information field created by the information particles, and how to use it to improve generalization and construct bounds on system errors. He will depart from the static fields provided by statistics and incorporate time into the information forces to mimic the interactions in diffusion and oscillating fields, opening the door to adapting systems in space-time. He will also study the robustness of entropy for parameter estimation and compare its performance with that of SVMs. The broad appeal of the novel pairwise interaction model for learning is that it opens drastically new opportunities to understand, apply, and implement adaptive systems. The PI will study the following applications of the pairwise interaction model:
1- Model the information processing in the dendritic tree as a spatio-temporal interaction field that estimates entropy. Seek on-line algorithms and, hopefully, show their relation to Hebbian learning. Our preliminary results are very encouraging. Dr. Henry Markram, a world expert in dynamic synapses, will help establish the biological plausibility of the model.
2- Formulate state-space models of linear and nonlinear systems with our entropy estimator. This work will extend the field of minimum entropy control to non-Gaussian processes and will provide a new approach to implementing robust control.
3- Apply the entropy estimator to blind deconvolution of wireless channels as well as to multiuser detection. The PI will extend his current work on blind source separation to convolutive mixtures, the more realistic (but also harder) case. In particular, he will target the cocktail party effect in teleconferencing.
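The pairwise interaction model above can be made concrete with a minimal sketch, under the assumption that the entropy estimator is Renyi's quadratic entropy computed with a Gaussian Parzen window (the proposal does not fix the estimator; function names and the kernel width `sigma` are our illustrative choices). Samples act as particles whose pairwise kernel interactions sum to an "information potential" V; entropy is -log V, and the gradient of V with respect to each sample gives the information forces:

```python
import numpy as np

def pairwise_kernel(x, sigma):
    """Gaussian interactions between all pairs of 1-D samples
    ("information particles"); the width sigma*sqrt(2) arises from
    convolving two Parzen windows of width sigma."""
    d = x[:, None] - x[None, :]
    return d, np.exp(-d**2 / (4 * sigma**2)) / np.sqrt(4 * np.pi * sigma**2)

def information_potential(x, sigma=1.0):
    """V(x): mean pairwise interaction; Renyi's quadratic entropy is -log V."""
    _, k = pairwise_kernel(x, sigma)
    return k.mean()

def renyi_quadratic_entropy(x, sigma=1.0):
    return -np.log(information_potential(x, sigma))

def information_forces(x, sigma=1.0):
    """dV/dx_i: the net "force" each particle feels from all others."""
    d, k = pairwise_kernel(x, sigma)
    # derivative of the Gaussian kernel w.r.t. x_i, summed over partners j;
    # the factor 2 accounts for x_i appearing in both orderings of the
    # double sum that defines V
    return 2 * (-d / (2 * sigma**2) * k).sum(axis=1) / x.size**2
```

Concentrated samples yield a larger potential and hence lower entropy, and the forces cancel in aggregate (each pairwise force has an equal and opposite partner), so the field only rearranges particles relative to one another.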
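Item 2 rests on adapting parameters to minimize the entropy of the error rather than its mean square. A hedged sketch for the simplest case, a linear filter adapted by gradient ascent on the information potential of its errors (minimum error entropy); `mee_step`, the learning rate `lr`, and the kernel width `sigma` are illustrative assumptions, not the proposal's design:

```python
import numpy as np

def mee_step(w, X, d, sigma=1.0, lr=0.5):
    """One minimum-error-entropy (MEE) update for a linear model y = X @ w:
    gradient ascent on the information potential of the errors e = d - y,
    equivalent to descending Renyi's quadratic entropy of the error."""
    e = d - X @ w
    de = e[:, None] - e[None, :]                     # pairwise error differences
    k = np.exp(-de**2 / (4 * sigma**2)) / np.sqrt(4 * np.pi * sigma**2)
    dX = X[:, None, :] - X[None, :, :]               # pairwise input differences
    # dV/dw via the chain rule: d(e_i - e_j)/dw = -(x_i - x_j)
    grad = ((de / (2 * sigma**2)) * k)[:, :, None] * dX
    return w + lr * grad.sum(axis=(0, 1)) / e.size**2
```

Unlike the MSE, the error entropy is blind to the error mean, so a bias would be set separately in practice; and for large kernel widths the update approaches an MSE-like rule, which gives one concrete handle on the comparison with the mean square error sought above.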