Numerical solution of the optimal control problem

A new method is proposed here for computing the optimal control, i.e., the control that minimizes a given performance index while steering the given initial state of a system to a point in the state space that may or may not be specified. The classical calculus-of-variations approach and the modern approach of variation in control, which leads to Pontryagin's principle, are first compared. Then, building on the ideas underlying the derivation of the Maximum Principle, we derive Pontryagin's principle using the classical calculus-of-variations approach modified along the lines of a brief perturbation, and we obtain an expression for the change in the value of the performance index over each discretized interval. First, we derive an expression for the difference in the performance index due to a perturbed control $\mathbf{u}$ that differs slightly from $\mathbf{u}^{*}$. Then, using the calculus of variations once again, we derive a formula for this difference based on the brief perturbation suggested by Agashe. Finally, three examples of linear and non-linear systems are given to demonstrate the usefulness of these derivations: two use the calculus of variations with the perturbation approach, and one is based on the derivation of Pontryagin's principle.
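For orientation, the following is a minimal sketch of the kind of expression involved, written in standard notation rather than the notation developed later in the paper: for a system $\dot{\mathbf{x}} = \mathbf{f}(\mathbf{x},\mathbf{u},t)$ with performance index $J$, replacing the optimal control $\mathbf{u}^{*}$ by an admissible value $\mathbf{v}$ on a brief interval $[\tau,\tau+\varepsilon]$ changes $J$, to first order in $\varepsilon$, by the difference of the Hamiltonian evaluated at the two controls (the symbols $h$, $g$, $H$, and the costate $\mathbf{p}$ are introduced only for this sketch and are not necessarily the paper's notation):

$$
J(\mathbf{u}) = h\bigl(\mathbf{x}(t_f)\bigr) + \int_{t_0}^{t_f} g\bigl(\mathbf{x}(t),\mathbf{u}(t),t\bigr)\,dt,
\qquad
H(\mathbf{x},\mathbf{u},\mathbf{p},t) = g(\mathbf{x},\mathbf{u},t) + \mathbf{p}^{\mathsf{T}}\mathbf{f}(\mathbf{x},\mathbf{u},t),
$$

$$
\Delta J = J(\mathbf{u}) - J(\mathbf{u}^{*})
\approx \varepsilon\,\Bigl[\, H\bigl(\mathbf{x}^{*}(\tau),\mathbf{v},\mathbf{p}(\tau),\tau\bigr)
- H\bigl(\mathbf{x}^{*}(\tau),\mathbf{u}^{*}(\tau),\mathbf{p}(\tau),\tau\bigr) \Bigr] \;\ge\; 0,
$$

where $\mathbf{p}$ satisfies the costate equation $\dot{\mathbf{p}} = -\partial H/\partial\mathbf{x}$ with $\mathbf{p}(t_f) = \partial h/\partial\mathbf{x}$ evaluated at $\mathbf{x}^{*}(t_f)$. The non-negativity of $\Delta J$ for every admissible $\mathbf{v}$ is what yields the minimum condition of Pontryagin's principle; the paper develops an expression of this type evaluated over each discretized interval.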