Differential dynamic programming

dx/dt = f(x, u, t),  x(0) given,  0 ≤ t ≤ T,

where u(t) is a vector of control variables. Find that function u on [0, T] which minimises F[x(T)]. This is equivalent to the Mayer problem in the calculus of variations. With appropriate modifications it represents the control of a rocket trajectory, control of a batch chemical reactor, and some other engineering problems. The chief remaining interest in the problem, for engineering applications, lies in finding efficient computational methods. These have to be iterative numerical methods, and the most favoured start with a non-optimal u and successively improve it by hill-climbing methods.

Jacobson and Mayne present new algorithms for this purpose, which they derive from Bellman's dynamic programming. The essential novelty lies in using the idea of strong variations. Greater computational efficiency is claimed, and is supported by a number of computed examples. The generalisations which are treated include control constraints, discrete-time systems, and stochastic systems. Emphasis in the book is on numerical aspects, and it does not give a broad account of optimal control. For this reason its appeal will be to specialists rather than to the majority of students of control.
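The iterative scheme described above — start from a non-optimal control u and successively improve it until the terminal cost F[x(T)] stops decreasing — can be sketched in a few lines. The sketch below uses plain finite-difference gradient descent on a discretised control sequence, not the authors' DDP algorithm with strong variations; the dynamics f, the terminal cost F, and all numerical values are illustrative assumptions.

```python
def simulate(u, x0=0.0, dt=0.1):
    """Integrate dx/dt = f(x, u) = -x + u with Euler steps; return x(T)."""
    x = x0
    for uk in u:
        x += dt * (-x + uk)
    return x

def terminal_cost(u, target=1.0):
    """Mayer-type terminal cost F[x(T)] = (x(T) - target)**2."""
    return (simulate(u) - target) ** 2

def improve(u, step=0.5, eps=1e-6):
    """One hill-climbing pass: nudge each control component downhill,
    using a finite-difference estimate of dF/du_k."""
    new_u = list(u)
    for k in range(len(u)):
        probe = list(u)
        probe[k] += eps
        grad_k = (terminal_cost(probe) - terminal_cost(u)) / eps
        new_u[k] = u[k] - step * grad_k
    return new_u

u = [0.0] * 20          # non-optimal starting control on [0, T], T = 2.0
for _ in range(200):    # successive improvement iterations
    u = improve(u)
```

Each pass costs one cost evaluation per control component, which is exactly the inefficiency that second-order methods such as DDP are designed to avoid: they propagate value-function information backwards along the trajectory instead of probing each component independently.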