This research concerns several basic problems in stochastic adaptive estimation and control. The adaptive control problem (with estimation incorporated) is that of designing a feedback controller (algorithm) that minimizes a given performance criterion or tracks a given reference signal when the process model contains parameters that are unknown (due to modeling difficulties) or changing (due to wear, temperature variations, etc.). The adaptive control algorithm in the feedback loop attempts to actively reduce the parameter uncertainty by performing on-line parameter estimation and then using the updated estimates to continuously reconfigure or retune the controller. The proposed research represents innovative approaches to the adaptive estimation and control of systems that are nonlinear and/or involve incomplete state observations; the problems considered are more difficult than those solved to date. In particular, this research should yield a deep understanding of the adaptive estimation and control of Markov chains with incomplete observations; such problems arise in the control of computer communication networks, as well as in quality control, maintenance, replacement, and repair of industrial processes. A thorough investigation of the adaptive control of stochastic bilinear systems is also proposed; such dynamic systems contain terms that are bilinear in the state and the control. Professor Marcus is well known for his work on stochastic control, and the University of Texas at Austin provides a fine research environment for this project. This award renews research previously supported under NSF Grant ECS-8412100.
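The estimate-then-retune loop described above can be illustrated with a minimal sketch. This is an assumed toy instance, not the proposal's method: a scalar linear system x[t+1] = a*x[t] + b*u[t] + w[t] with an unknown coefficient a, estimated by recursive least squares, and a certainty-equivalence controller that is retuned with the latest estimate at every step.

```python
import random

def adaptive_control(a_true=0.8, b=1.0, steps=200, seed=0):
    """Certainty-equivalence adaptive control of a scalar linear system.

    Illustrative sketch only: the system, the recursive-least-squares
    estimator, and all parameter names are assumptions for exposition.
    """
    rng = random.Random(seed)
    x = 1.0                  # initial state
    a_hat, p = 0.0, 100.0    # RLS estimate of 'a' and its scalar covariance
    for _ in range(steps):
        # Retune the controller with the current parameter estimate:
        # choose u so that the estimated deterministic part of x[t+1] is zero.
        u = -(a_hat / b) * x
        w = rng.gauss(0.0, 0.01)                 # process noise
        x_next = a_true * x + b * u + w
        # RLS update: regress (x_next - b*u) on x to refine the estimate of 'a'.
        if abs(x) > 1e-12:
            k = p * x / (1.0 + p * x * x)        # RLS gain
            a_hat += k * (x_next - b * u - a_hat * x)
            p -= k * x * p
        x = x_next
    return a_hat

est = adaptive_control()   # estimate approaches the true value a_true = 0.8
```

The loop body shows the two ingredients the abstract names: on-line parameter estimation (the RLS update) and continuous controller retuning (recomputing u from the latest a_hat). The problems the proposal targets, e.g. partially observed Markov chains, are harder precisely because the state itself is not directly available for such a regression.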

Project Start:
Project End:
Budget Start: 1987-09-01
Budget End: 1991-08-31
Support Year:
Fiscal Year: 1986
Total Cost: $182,722
Indirect Cost:
Name: University of Texas Austin
Department:
Type:
DUNS #:
City: Austin
State: TX
Country: United States
Zip Code: 78712