A Game-Theoretic Approach to Control and Stabilization of Systems with Disturbances

In this paper, a control problem and a stabilization problem for linear plants with disturbances are considered by using a game-theoretic approach. A class of linear differential games with fixed terminal time is solved by an elementary method. This game may be regarded as an extension of the well-known regulator problem in control theory. As in the regulator problem, the optimal strategies are given by linear feedback control laws whose gain matrices are obtained by solving a matrix Riccati equation. Unlike the regulator problem, however, this equation is not necessarily solvable. It is shown that this equation is solvable if and only if the solution of a certain linear matrix differential equation is nonsingular. Using these results, it is shown that, under a certain condition, there exists a linear feedback control law which brings the state of the system into an arbitrarily small neighbourhood of the origin, whatever square-integrable disturbance the system undergoes. This result suggests the possibility of extending the concept of controllability, defined by R.E. Kalman for linear dynamical systems, to systems under conflict or systems with disturbances. It is also shown that, if the value of the disturbance is bounded by the norm of the current state vector, there exists a linear feedback control law which makes the system asymptotically stable.
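The setting described in the abstract can be sketched as follows, using the standard formulation of a linear-quadratic differential game with fixed terminal time; the symbols A, B, C, Q, R, N, S and the sign conventions below are illustrative assumptions, not necessarily the paper's own notation.

```latex
% Plant with control u (minimizer) and disturbance v (maximizer):
%   \dot{x}(t) = A x(t) + B u(t) + C v(t), \qquad x(0) = x_0 .
%
% Quadratic payoff over the fixed horizon [0, T]:
%   J(u, v) = x(T)^{\top} S\, x(T)
%           + \int_0^T \bigl( x^{\top} Q x + u^{\top} R u - v^{\top} N v \bigr)\, dt,
% with S, Q \succeq 0 and R, N \succ 0; u seeks \min_u \max_v J.
%
% Saddle-point strategies are linear feedback laws,
%   u^{*}(t) = -R^{-1} B^{\top} P(t)\, x(t), \qquad
%   v^{*}(t) = \phantom{-}N^{-1} C^{\top} P(t)\, x(t),
%
% where the gain matrix P(t) solves the matrix Riccati equation
% backwards from the terminal condition P(T) = S:
%   -\dot{P} = A^{\top} P + P A + Q
%            - P \bigl( B R^{-1} B^{\top} - C N^{-1} C^{\top} \bigr) P .
```

In this sketch, the quadratic term multiplying P on both sides is indefinite because of the disturbance contribution $-C N^{-1} C^{\top}$; this is why, unlike the regulator problem, the Riccati solution can escape to infinity before reaching $t = 0$, matching the abstract's statement that the equation is not necessarily solvable.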