Optimal control of jump-linear Gaussian systems

This paper investigates the problem of controlling a discrete-time linear system with jump parameters. A review of the literature is presented, along with a development of the application of dynamic programming to this class of control problems. Dynamic programming has been applied by many researchers, and it has been observed that no closed-form analytical solution can be constructed because of the ‘dual’ aspects of the controller. The main contribution of the present work is an algorithm, suitable for computer implementation, for the optimal dual control. The algorithm is constructed by transforming the dynamic programming relations into a space of sufficient statistics and using a finite-dimensional optimization procedure to obtain the optimal control as a function of the statistics. This is achieved by first developing a suitable recursive realization of a ‘filter’ that generates the sufficient statistics for the problem and then embedding this filter into the dynamic programming equations. ...
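The two ingredients described above can be illustrated with a small numerical sketch. The Python fragment below is not the paper's algorithm; it is a hedged illustration, under assumed scalar dynamics and illustrative parameter values, of (i) a recursive ‘filter’ that propagates the posterior probabilities of the Markov jump (mode) process as the sufficient statistic, and (ii) a finite-dimensional search over a grid of candidate controls as a function of that statistic. A one-step expected quadratic cost stands in for the full dynamic programming recursion over the statistic space.

```python
import numpy as np

# Per-mode dynamics x_{k+1} = a[m]*x_k + b[m]*u_k + w_k (illustrative values)
a = np.array([1.0, 0.5])
b = np.array([1.0, 0.2])
P = np.array([[0.9, 0.1],      # assumed Markov mode-transition probabilities
              [0.2, 0.8]])
sigma = 0.1                    # process-noise standard deviation
Q, R = 1.0, 0.1                # quadratic state / control cost weights

def filter_update(pi, x_prev, u_prev, x_new):
    """Recursive sufficient-statistic update: propagate the mode posterior
    through the Markov chain, then apply Bayes' rule with the Gaussian
    likelihood of the observed state transition."""
    pred = P.T @ pi
    lik = np.exp(-0.5 * ((x_new - (a * x_prev + b * u_prev)) / sigma) ** 2)
    post = pred * lik
    return post / post.sum()

def greedy_control(pi, x, candidates=np.linspace(-2.0, 2.0, 81)):
    """Finite-dimensional optimization over a control grid: pick the input
    minimizing the expected one-step quadratic cost under the posterior pi
    (a stand-in for the full dynamic-programming recursion over statistics)."""
    costs = [np.sum(pi * (Q * (a * x + b * u) ** 2 + R * u ** 2)) for u in candidates]
    return candidates[int(np.argmin(costs))]

# Usage: one closed-loop run with the filter embedded in the control loop.
rng = np.random.default_rng(0)
x, pi, mode = 1.0, np.array([0.5, 0.5]), 0
for k in range(10):
    u = greedy_control(pi, x)
    mode = rng.choice(2, p=P[mode])
    x_new = a[mode] * x + b[mode] * u + sigma * rng.standard_normal()
    pi = filter_update(pi, x, u, x_new)
    x = x_new
    print(f"k={k}  u={u:+.2f}  x={x:+.3f}  pi={np.round(pi, 2)}")
```

The printed trajectory shows the mode posterior sharpening as observations accumulate; a full dual controller would additionally account for how the choice of control affects that sharpening, which is what the optimization over the statistic space in the paper addresses.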
