In this paper, a control problem and a stabilization problem for linear plants with disturbances are considered using a game-theoretic approach. A class of linear differential games with fixed terminal time is solved by an elementary method. This game may be regarded as an extension of the well-known regulator problem in control theory. As in the regulator problem, the optimal strategies are given by linear feedback control laws whose gain matrices are obtained by solving a matrix Riccati equation. Unlike the regulator problem, however, this equation is not necessarily solvable. It is shown that this equation is solvable if and only if the solution of a certain linear matrix differential equation is nonsingular. Using these results, it is shown that, under a certain condition, there exists a linear feedback control law which brings the state of the system into an arbitrary neighbourhood of the origin, whatever square-integrable disturbance the system undergoes. This result suggests the possibility of extending the concept of controllability defined by R. E. Kalman for linear dynamical systems to systems under conflict or systems with disturbances. It is also shown that, if the value of the disturbance is bounded by the norm of the current state vector, there exists a linear feedback control law which makes the system asymptotically stable.
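The possible unsolvability of the game-theoretic Riccati equation can be illustrated numerically. The sketch below (an assumption for illustration, not the paper's own formulation) integrates, by a simple Euler scheme, a standard two-player LQ game Riccati equation of the form -dP/dt = A'P + PA + Q - P(B R^{-1} B' - C S^{-1} C')P with terminal condition P(T) = F, where u enters through B and the disturbance through C. When the disturbance weight S is small enough, the quadratic term changes sign and the backward solution escapes in finite time, i.e. no solution exists on the whole interval; the function names and the escape test are hypothetical choices.

```python
import numpy as np

def game_riccati(A, B, C, Q, R, S, F, T, steps=2000):
    """Integrate the game-theoretic matrix Riccati equation backward
    from the terminal condition P(T) = F:

        -dP/dt = A'P + PA + Q - P (B R^{-1} B' - C S^{-1} C') P

    Returns P(0) if a solution exists on [0, T], or None if the
    solution escapes in finite time (the game Riccati equation,
    unlike the regulator one, need not be solvable).
    """
    dt = T / steps
    # Net quadratic term: the control reduces the cost, the
    # disturbance increases it; their balance decides solvability.
    M = B @ np.linalg.inv(R) @ B.T - C @ np.linalg.inv(S) @ C.T
    P = np.array(F, dtype=float)
    for _ in range(steps):
        dP = A.T @ P + P @ A + Q - P @ M @ P
        P = P + dt * dP  # Euler step backward in time
        if not np.all(np.isfinite(P)) or np.linalg.norm(P) > 1e12:
            return None  # finite escape: no solution on [0, T]
    # The optimal feedback gain would then be K(t) = R^{-1} B' P(t),
    # giving the linear control law u = -K x.
    return P


if __name__ == "__main__":
    A = np.array([[0.0]])
    B = np.array([[1.0]])
    C = np.array([[1.0]])
    Q = np.array([[1.0]])
    R = np.array([[1.0]])
    F = np.array([[0.0]])
    # Expensive disturbance (S large): the equation is solvable and,
    # for this scalar example, P(0) approaches sqrt(Q/M) = sqrt(2).
    print(game_riccati(A, B, C, Q, R, np.array([[2.0]]), F, T=5.0))
    # Cheap disturbance (S small): the quadratic term flips sign and
    # the backward solution escapes in finite time.
    print(game_riccati(A, B, C, Q, R, np.array([[0.5]]), F, T=5.0))
```

In the scalar case the dichotomy is easy to see: with A = 0 the backward equation is dp/dτ = q - m p², which settles toward √(q/m) when m > 0 but blows up like tan(τ) when m < 0.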
[1] H. Kimura, "Linear differential games with terminal payoff," 1970.
[2] D. Luenberger et al., "Differential games with imperfect state information," 1969.
[3] Y. Ho et al., "Differential games and optimal pursuit-evasion strategies," 1965.
[4] H. Antosiewicz et al., "Differential Equations: Stability, Oscillations, Time Lags," 1967.
[5] Yoshiyuki Sakawa et al., "Solution of Linear Pursuit-Evasion Games," 1970.
[6] R. E. Kalman et al., "Contributions to the Theory of Optimal Control," 1960.
[7] Y. Ho et al., "On a class of linear stochastic differential games," 1968.