Infinite Horizon LQ Optimal Control for Discrete-Time Stochastic Systems

This paper is concerned with the infinite horizon linear quadratic (LQ) optimal control problem for discrete-time stochastic systems with both state- and control-dependent noise. Under the assumptions of stabilizability and exact observability, it is shown that the optimal control law and the optimal value exist, and the properties of the associated discrete algebraic Riccati equation (ARE) are also discussed.
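
For reference, a standard formulation of this problem class is sketched below; the symbols $A$, $B$, $C$, $D$, $Q$, $R$, $P$, and $K$ are introduced here purely for illustration, and the paper's exact model, noise structure, and notation may differ. One common setup considers the controlled system with multiplicative (state- and control-dependent) noise and quadratic cost
\[
x_{k+1} = A x_k + B u_k + (C x_k + D u_k) w_k, \qquad
J(x_0, u) = \mathbb{E}\sum_{k=0}^{\infty}\bigl(x_k^{\top} Q x_k + u_k^{\top} R u_k\bigr),
\]
where $\{w_k\}$ is a zero-mean, unit-variance i.i.d. scalar sequence, $Q \succeq 0$, and $R \succ 0$. In this setting the associated discrete (generalized) ARE takes the form
\[
P = Q + A^{\top} P A + C^{\top} P C
  - \bigl(A^{\top} P B + C^{\top} P D\bigr)
    \bigl(R + B^{\top} P B + D^{\top} P D\bigr)^{-1}
    \bigl(B^{\top} P A + D^{\top} P C\bigr),
\]
with the candidate optimal feedback $u_k^{*} = K x_k$, where
$K = -\bigl(R + B^{\top} P B + D^{\top} P D\bigr)^{-1}\bigl(B^{\top} P A + D^{\top} P C\bigr)$,
and candidate optimal value $J^{*}(x_0) = x_0^{\top} P x_0$.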