On the Bellman Equation for Infinite Horizon Problems with Unbounded Cost Functional

Abstract. We study a class of infinite horizon control problems for nonlinear systems, which includes the Linear Quadratic (LQ) problem, using the Dynamic Programming approach. Sufficient conditions for the regularity of the value function are given. The value function is compared with sub- and supersolutions of the Bellman equation, and a uniqueness theorem is proved for this equation among locally Lipschitz functions that are bounded below. As an application, it is shown that an optimal control for the LQ problem is nearly optimal for a large class of small unbounded, nonlinear, and nonquadratic perturbations of the same problem.
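For orientation, a standard setup consistent with the abstract (the precise assumptions on $f$ and $\ell$ are those of the paper, not reproduced here) is the following: the value function of the infinite horizon problem and the associated stationary Bellman equation may be written as
\[
V(x) \;=\; \inf_{u(\cdot)} \int_0^{\infty} \ell\bigl(y(t),u(t)\bigr)\,dt,
\qquad \dot y(t) = f\bigl(y(t),u(t)\bigr),\quad y(0)=x,
\]
\[
\sup_{u}\;\bigl\{\, -f(x,u)\cdot DV(x) \;-\; \ell(x,u) \,\bigr\} \;=\; 0 .
\]
The LQ problem corresponds to the special case $f(y,u)=Ay+Bu$ and $\ell(y,u)=y^{\top}Qy+u^{\top}Ru$, for which the running cost $\ell$ is unbounded in the state, which is the situation addressed in the title.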