Minimum Effort Control Systems

An optimal control problem is considered in which it is desired to transfer a linear control system from one given state to another. The target state may be either a point or a closed convex set. Optimization is understood in the sense of minimizing the control effort, where effort is defined either as the maximum amplitude of the control or as the integral of a certain function of the control. The optimization problem is reduced to the problem of finding the unique minimum of a function of n variables (where n is the order of the system). It is shown that the method of steepest descent is particularly applicable to finding this minimum, and consequently to determining the minimum effort and the optimal control.
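The steepest-descent step referred to above can be illustrated in isolation. The sketch below is not the paper's construction; it is a generic gradient-descent routine applied to a hypothetical convex quadratic objective of n = 3 variables, standing in for the n-variable function whose unique minimum determines the minimum effort. The matrix A and vector b are illustrative assumptions, not quantities from the source.

```python
import numpy as np

def steepest_descent(grad, x0, step=0.1, tol=1e-8, max_iter=10000):
    """Minimize a differentiable function by repeatedly stepping
    against its gradient until the gradient norm falls below tol."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g
    return x

# Illustrative convex objective f(x) = 1/2 x^T A x - b^T x, whose unique
# minimizer solves A x = b (A symmetric positive definite). Its gradient
# is A x - b.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

x_star = steepest_descent(lambda x: A @ x - b, x0=np.zeros(3))
```

Because the objective is strictly convex, the descent iteration converges to the unique minimizer for any sufficiently small fixed step, mirroring the uniqueness property the abstract relies on.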