Preface
Introduction

Part One: Stability of Control Systems
I. Continuous and Discrete Deterministic Systems
II. Stability of Stochastic Systems

Part Two: Control of Deterministic Systems
III. Description of Control Problems
IV. The Classical Calculus of Variations and Optimal Control
V. The Maximum Principle
VI. Linear Control Systems
VII. Dynamic Programming Approach. Sufficient Conditions for Optimal Control
VIII. Some Additional Topics of Optimal Control Theory

Part Three: Optimal Control of Dynamical Systems under Random Disturbances
IX. Control of Stochastic Systems. Problem Statements and Investigation Techniques
X. Optimal Control on a Time Interval of Random Duration
XI. Optimal Estimation of the State of the System
XII. Optimal Control of the Observation Process

Part Four: Numerical Methods in Control Systems
XIII. Linear Time-Invariant Control Systems
XIV. Numerical Methods for the Investigation of Nonlinear Control Systems
XV. Numerical Design of Optimal Control Systems

General References
Subject Index