Optimal control of systems with distributed parameters

Summary: Plants with distributed parameters are widespread in many branches of engineering. Designing control systems that are optimal in a given sense for such plants requires the development of new optimal control techniques, since systems with distributed parameters have a number of specific properties that make their investigation considerably more complex than that of lumped-parameter systems. This paper states the optimal control problem for a fairly broad class of systems with distributed parameters described by non-linear integral equations under arbitrary constraints. A maximum principle is formulated for this case, giving necessary conditions for determining optimal controls. The paper also considers approximate methods of solving optimal control problems for systems with distributed parameters, based on approximating the equations for the distribution functions by ordinary differential equations.
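To indicate the type of statement involved, the following is a minimal sketch of a maximum-principle condition for a plant governed by a non-linear integral equation, written for a scalar state and a cost that does not depend explicitly on the control; the notation, assumptions, and precise formulation here are illustrative and need not coincide with those of the paper.

$$
Q(x) = \int_D K\big(x, s, Q(s), u(s)\big)\, ds, \qquad
I[u] = \int_D F\big(x, Q(x)\big)\, dx \to \min,\qquad u(s) \in \Omega .
$$

Introducing an adjoint function $\psi$ satisfying the linear integral equation
$$
\psi(x) = F_Q\big(x, Q(x)\big) + \int_D K_Q\big(s, x, Q(x), u(x)\big)\,\psi(s)\, ds ,
$$
a first-order necessary condition is that, for almost every $s \in D$, the optimal control delivers the pointwise extremum of the "Hamiltonian-type" function
$$
H(s, v) = \int_D \psi(x)\, K\big(x, s, Q(s), v\big)\, dx , \qquad v \in \Omega ,
$$
the sign convention (maximization or minimization of $H$) depending on how the cost and adjoint are defined.

The approximate method mentioned in the summary rests on replacing the distributed state by a finite set of ordinary differential equations. The paper's own scheme is not reproduced here; as a generic illustration of the idea, the Python sketch below semi-discretizes a one-dimensional heat-conduction plant in space (method of lines), after which lumped-parameter optimal control techniques can be applied. The grid size, diffusivity, and the piecewise-constant boundary control are hypothetical choices made for the example only.

```python
# Minimal sketch (not the paper's scheme): reduce a distributed-parameter plant,
# a 1-D heat-conducting rod with boundary control u(t), to a finite system of
# ordinary differential equations by spatial discretization (method of lines).
import numpy as np
from scipy.integrate import solve_ivp

N = 50                       # number of interior grid points (illustrative)
L = 1.0                      # rod length
a = 1.0                      # thermal diffusivity (illustrative)
h = L / (N + 1)              # grid spacing

def u(t):
    """Boundary control applied at x = L (hypothetical choice for illustration)."""
    return 1.0 if t < 0.5 else 0.0

def rhs(t, Q):
    """Semi-discretized heat equation:
    dQ_i/dt = a*(Q_{i-1} - 2*Q_i + Q_{i+1})/h**2, with Q(0,t)=0, Q(L,t)=u(t)."""
    Qm = np.concatenate(([0.0], Q, [u(t)]))          # state plus boundary values
    return a * (Qm[:-2] - 2.0 * Qm[1:-1] + Qm[2:]) / h**2

Q0 = np.zeros(N)             # initial temperature distribution
sol = solve_ivp(rhs, (0.0, 1.0), Q0, method="BDF",
                t_eval=np.linspace(0.0, 1.0, 11))
print(sol.y[:, -1])          # approximate temperature profile at t = 1
```

Once the plant is in this lumped form, the resulting system of ODEs can be treated with the standard apparatus of optimal control for ordinary differential equations, which is the sense in which the approximation makes the distributed-parameter problem tractable.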