Optimal control of piecewise deterministic Markov processes.

This thesis develops a complete theory of optimal control of piecewise deterministic Markov processes under weak assumptions. The theory consists of a description of the processes, a nonsmooth stochastic maximum principle as a necessary optimality condition, a generalized Bellman-Hamilton-Jacobi condition involving the Clarke generalized gradient that is both necessary and sufficient for optimality, existence results, and regularity properties of the value function. The impulse control problem is transformed into an equivalent optimal dynamic control problem. The cost functions are subject only to growth conditions.
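
To make the objects concrete, the following LaTeX sketch records one common form of the controlled dynamics and of a generalized Bellman-Hamilton-Jacobi condition for such processes. The notation is assumed for exposition only and is not taken from the thesis itself: f denotes the controlled drift, \lambda the jump rate, Q the post-jump transition kernel on the state space E, \ell the running cost, U the control set, and V the value function.

% Hedged sketch under the assumed notation above. Between jumps the
% state of a piecewise deterministic Markov process follows a
% controlled ordinary differential equation:
\[
  \dot{x}_t = f(x_t, u_t) \qquad \text{between jumps},
\]
% with jumps occurring at rate \lambda(x_t, u_t) and post-jump
% positions drawn from Q(\,\cdot \mid x_t, u_t). Where V fails to be
% differentiable, the gradient in the dynamic programming condition is
% replaced by elements \xi of the Clarke generalized gradient
% \partial V(x):
\[
  \inf_{u \in U} \Bigl[\, \xi \cdot f(x,u)
    + \lambda(x,u) \int_E \bigl( V(y) - V(x) \bigr)\, Q(dy \mid x, u)
    + \ell(x,u) \Bigr] = 0,
  \qquad \xi \in \partial V(x).
\]
% The precise quantification over \xi \in \partial V(x), and the sense
% in which this condition is both necessary and sufficient, are what
% the generalized condition in the thesis makes rigorous.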