The ability to optimally interact with and guide a process towards its specified goals depends on the capacity to observe, model and manipulate its dynamics as reliably as possible. The nonlinear nature of most chemical processes presents challenging operational problems, compounded by uncertainties in the dynamics and errors in the data. Linear techniques of modeling, estimation and control impose many limitations on nonlinear systems operating over wide-ranging conditions. This research aims to advance nonlinear process systems engineering by formulating Bayesian solutions to problems in probabilistic modeling, optimal state estimation and optimal control using finite-state Markov chains. A new and general methodology will be developed to build discrete-time finite-state Markov chains from trajectory models in state space. The Markov chain will model the temporal evolution of the state probability density function (pdf) under arbitrary nonlinearities and stochastic excitation.
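The construction can be pictured as follows: partition the state space into cells, propagate sampled states one step through the trajectory model, and tally which cell each sample lands in to obtain a row-stochastic transition matrix; propagating a discretized pdf then reduces to a vector-matrix product. The sketch below illustrates this idea in Python for a hypothetical one-dimensional map with additive Gaussian noise; the map f, the grid on [-2, 2], the noise level and the sample counts are illustrative assumptions, not the specific models to be studied in the project.

```python
import numpy as np

# Hypothetical 1-D nonlinear trajectory model with additive noise:
# x_{k+1} = f(x_k) + w_k.  The map and noise level are illustrative only.
def f(x):
    return x + 0.1 * (1.0 - x**2)

def build_markov_chain(edges, n_samples=500, noise_std=0.05, rng=None):
    """Estimate a row-stochastic transition matrix over the cells
    defined by `edges` via Monte Carlo sampling of one-step transitions."""
    rng = np.random.default_rng(rng)
    n_cells = len(edges) - 1
    P = np.zeros((n_cells, n_cells))
    for i in range(n_cells):
        # sample starting states uniformly inside cell i
        x0 = rng.uniform(edges[i], edges[i + 1], size=n_samples)
        x1 = f(x0) + noise_std * rng.standard_normal(n_samples)
        # count which cell each one-step trajectory lands in (clipped to the grid)
        j = np.clip(np.searchsorted(edges, x1) - 1, 0, n_cells - 1)
        P[i] = np.bincount(j, minlength=n_cells) / n_samples
    return P

edges = np.linspace(-2.0, 2.0, 51)      # 50 cells on [-2, 2]
P = build_markov_chain(edges)

# Propagate a discretized state pdf: p_{k+1} = p_k P
p = np.full(50, 1.0 / 50)               # start from a uniform pdf
for _ in range(20):
    p = p @ P
```

Because the transition matrix is built offline, online pdf propagation costs only a matrix-vector product per step, which is what makes this representation attractive for recursive estimation and control.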

Optimal estimation of states from noisy data is a critical task in systems engineering. Existing methods rely on linear-Gaussian assumptions to pose tractable least-squares optimization problems, but they do not use the data efficiently when the underlying dynamics are nonlinear. Recursive methods based on extensions of the Kalman filter to nonlinear systems tend to diverge because recursive relationships on the summary statistics of non-Gaussian pdfs break down. A general probability-density-based solution, called the cell filter, will be developed for recursive Bayesian estimation using the Markov modeling approach in discrete space. By handling non-Gaussian process and measurement errors, process constraints and non-additive errors, the cell filter will provide a highly general estimation strategy for a wide class of nonlinear processes. Nonlinear optimal control strategies will be developed by posing the problem in discrete space, and a novel cell iterative dynamic programming approach is proposed for the nonlinear optimization. The combination of Markov models of probabilistic dynamics, Bayesian estimation and dynamic programming can efficiently solve the online optimization required for nonlinear model predictive control (MPC).
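In this grid-based setting the Bayesian recursion is simple: the prior pdf is pushed forward through the Markov transition matrix (prediction) and then reweighted cell by cell with the measurement likelihood (update). The sketch below assumes the transition matrix and cell grid from the previous example together with a hypothetical direct measurement of the state corrupted by Gaussian noise; the sensor model and noise level are illustrative assumptions rather than the project's actual measurement equations.

```python
import numpy as np

def cell_filter_step(p, P, centers, y, meas_std=0.1):
    """One predict/update cycle of a grid-based (cell) Bayesian filter.

    p       -- current discretized state pdf over the cells
    P       -- row-stochastic cell transition matrix
    centers -- cell-center values of the state
    y       -- new measurement; assumed here to be y = x + v with
               v ~ N(0, meas_std^2), an illustrative sensor model
    """
    # Prediction: propagate the pdf one step through the Markov chain
    p_pred = p @ P

    # Update: Bayes' rule, evaluating the likelihood at each cell center
    likelihood = np.exp(-0.5 * ((y - centers) / meas_std) ** 2)
    p_post = likelihood * p_pred
    p_post /= p_post.sum()          # renormalize to a valid pdf

    # Point estimate: posterior mean over the cell centers
    x_hat = float(centers @ p_post)
    return p_post, x_hat

# Example usage with the grid from the previous sketch:
# centers = 0.5 * (edges[:-1] + edges[1:])
# p, x_hat = cell_filter_step(p, P, centers, y=0.3)
```

Because no Gaussian or additivity assumptions enter the recursion, the same prediction and update steps apply unchanged to constrained states, multimodal posteriors and non-additive errors.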

Intellectual Merit: The research advances the understanding of nonlinear process systems engineering at a fundamental level. Treating the pdf as the state of a process, and linear operators on pdfs as process models, extends state-space thinking. The merit lies in the fact that the Bayesian solutions not only generalize existing methods but also alleviate many of the practical difficulties associated with traditional nonlinear-optimization-based solutions. These ideas are substantiated with preliminary simulation results, which will pave the way for demonstrations on real processes. A continuous stirred-tank reactor (CSTR) case study shows that the proposed methodology is more accurate than existing methods in all three areas, with a computational load that is reasonable for low- to moderate-dimensional systems.

Broad Impact: Advances in nonlinear process operations will be incorporated into elective courses in advanced control and mathematics. Collaborations with the automation industry will quickly bring the results into practice. Maintaining a Linux cluster for parallel computing will improve the computational infrastructure of the organization. The broader societal impact comes from the application of Bayesian methods to improve the efficiency of the US process industries. Recent commercial successes of the Bayesian approach in the financial, pharmaceutical and software fields indicate the timeliness of this plan.

Budget Start: 2005-08-15
Budget End: 2008-07-31
Fiscal Year: 2005
Total Cost: $108,739
Name: Cleveland State University
City: Cleveland
State: OH
Country: United States
Zip Code: 44115