The control engineer is often confronted with the task of designing controllers for noisy systems whose parameter variation is so severe that classical or state-space-based controllers cannot handle it. This problem provides the material of Adaptive Control Theory. There are several aspects to Adaptive Control Theory. The design problem is to produce algorithms that adjust control settings in real time in response to plant variations. The analysis problem is to study the behavior and performance of such algorithms; analysis may lead (and has led) to design changes. Analysis involves questions of global stability, particularly in the presence of unmodeled dynamics, and of performance (e.g., attained tracking-error power and required control power or effort). The aim of the work proposed here is to examine these issues in a stochastic setting. To design new algorithms it seems useful to model the parameter variation, but this immediately raises a dimensionality problem; a resolution based on Factor Analysis ideas is suggested. Next is the issue most basic to Adaptive Control, namely stability. Three stochastic tools available to deal with it are reviewed and promising avenues are explored. The way in which the unmodeled dynamics problem impinges is also assessed. Finally, the performance issues can be tackled in a manner developed in Adaptive Signal Processing, through the ideas of misadjustment and excess lag. For good technical reasons, the deterministic and stochastic Adaptive Control developments have hitherto been somewhat split; it may now be possible to begin to bring them together.
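To make the dimensionality and performance notions concrete, the following sketch is offered as an illustration only; the particular model, dimensions, and gains are assumptions introduced here, not specifications from the proposal. A plant parameter drifts through a low-dimensional factor model, theta_t = theta_bar + B f_t with dim(f_t) much smaller than dim(theta_t), and is tracked by a normalized LMS update. The average squared prediction error in excess of the noise floor, divided by the minimum (noise-floor) error, estimates the misadjustment; reducing the gain mu lowers the noise-induced component but increases the lag component, which is the trade-off alluded to above.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: a p-dimensional parameter driven by an
# r-dimensional factor process (r << p), tracked with normalized LMS.
p, r, T = 20, 2, 20000
B = rng.standard_normal((p, r))           # assumed factor loadings
theta_bar = rng.standard_normal(p)        # mean parameter
mu = 0.05                                 # adaptation gain (assumed)
sigma_v = 0.1                             # observation-noise std (assumed)
rho = 0.999                               # slow AR(1) factor drift

f = np.zeros(r)
theta_hat = np.zeros(p)
excess = []
for t in range(T):
    f = rho * f + np.sqrt(1 - rho**2) * rng.standard_normal(r)
    theta = theta_bar + B @ f             # true, slowly varying parameter
    x = rng.standard_normal(p)            # regressor
    y = x @ theta + sigma_v * rng.standard_normal()
    e = y - x @ theta_hat                 # prediction error
    theta_hat += mu * e * x / (1 + x @ x) # normalized LMS step
    excess.append(e**2 - sigma_v**2)      # excess over the noise floor

# Misadjustment estimate = average excess MSE / minimum MSE; it combines
# the gradient-noise term (grows with mu) and the lag term (shrinks with mu).
print("misadjustment estimate:", np.mean(excess) / sigma_v**2)

Rerunning the sketch with a smaller or larger mu shows the two components of the misadjustment moving in opposite directions, which is the sense in which "excess lag" enters the performance assessment.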