A feedback control system can be structured for linear nonstationary process and measurement systems as a deterministic filter whose output serves as the independent variable of a linear control law. Subject to uniform controllability and observability, the filter and control gains can be specified to provide arbitrary and separable stability properties. If the filter gain is selected to produce a stabilizing effect on the state estimate, and the control gain is selected to produce a stabilizing effect on the process, the filter and control gains are shown to satisfy matrix Riccati differential equations. This suggests the use of stochastic optimal control theory when there is no quantitative measure of optimality but it is desirable to assure the qualitative property that the feedback be stabilizing. A concise derivation of the Kalman-Bucy filter is included in an appendix to illustrate the facility of approaching optimal estimation problems with the methods of stability theory.
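For concreteness, the standard continuous-time Riccati equations associated with this filter-plus-control-law structure are sketched below; the notation (A, B, C, the weighting matrices Q_c, R_c, Q_f, R_f, and the gains K, L with solutions S, P) is assumed here for illustration and is not taken from the paper itself.

% Sketch under assumed notation: the two Riccati equations that arise when the
% control gain and the filter gain are each chosen to be stabilizing.
\begin{align*}
  &\text{Process and measurement:} \quad
    \dot{x} = A(t)\,x + B(t)\,u + w, \qquad y = C(t)\,x + v. \\[4pt]
  &\text{Control gain (backward matrix Riccati equation):} \\
  &\qquad -\dot{S} = A^{\top} S + S A - S B R_c^{-1} B^{\top} S + Q_c,
    \qquad u = -K\hat{x}, \quad K = R_c^{-1} B^{\top} S. \\[4pt]
  &\text{Filter gain (forward matrix Riccati equation, Kalman--Bucy form):} \\
  &\qquad \dot{P} = A P + P A^{\top} - P C^{\top} R_f^{-1} C P + Q_f,
    \qquad \dot{\hat{x}} = A\hat{x} + B u + L\,(y - C\hat{x}), \quad L = P C^{\top} R_f^{-1}.
\end{align*}

The duality of the two equations is what permits the filter and control gains to be designed separately while the closed loop retains the combined stability properties.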