Existence of Optimal Controls for Stochastic Jump Processes

Sufficient conditions are given for the existence of an optimal control policy for a class of controlled jump processes. The processes are specified by a family of “local descriptions” depending on a control which is a function of the complete past of the process. Conditions for optimality were given in a previous paper [M. H. A. Davis and R. J. Elliott, Optimal control of a jump process, Z. Wahrscheinlichkeitstheorie und Verw. Gebiete, 40 (1977), pp. 183–202]. Here it is shown that, under fairly stringent conditions on the form of the local descriptions, an optimal policy can be constructed as long as a certain “Hamiltonian” function can be minimized.