A discrete method of optimal control based upon the cell state space concept

A discrete method of optimal control is proposed in this paper. The continuum state space of a system is discretized into a cell state space, and the cost function is discretized in a similar manner. Assuming intervalwise constant controls and using a finite set of admissible control levels (u) and a finite set of admissible time intervals (τ), the motion of the system under all possible interval controls (u, τ) can then be expressed in terms of a family of cell-to-cell mappings. The proposed method extracts the optimal control results from these mappings by a systematic search, culminating in the construction of a discrete optimal control table. Because the optimal control results can be expressed in the form of a control table, the method appears well suited to real-time control of systems.
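
As a rough illustration of the workflow described above (cell discretization, a family of cell-to-cell mappings indexed by (u, τ), and a systematic search that yields a discrete control table), the following sketch applies the same ingredients to an assumed double-integrator example with a quadratic interval cost. The dynamics, cost, grid sizes, admissible sets, and the Dijkstra-style backward search used here are illustrative assumptions for the sketch, not the paper's specific algorithm or data.

```python
import heapq
import itertools
import numpy as np

# Sketch only: a 2D cell state space over (x, v), a family of cell-to-cell
# mappings (one per admissible interval control (u, tau)), and a backward
# search that produces a discrete optimal control table (cell -> (u, tau)).

# Cell state space: x in [-1, 1], v in [-1, 1], Nx x Nv rectangular cells.
Nx, Nv = 21, 21
x_edges = np.linspace(-1.0, 1.0, Nx + 1)
v_edges = np.linspace(-1.0, 1.0, Nv + 1)

def cell_of(x, v):
    """Map a continuum state to its cell index, or None if outside the region."""
    if not (-1.0 <= x <= 1.0 and -1.0 <= v <= 1.0):
        return None
    i = min(max(np.searchsorted(x_edges, x) - 1, 0), Nx - 1)
    j = min(max(np.searchsorted(v_edges, v) - 1, 0), Nv - 1)
    return (i, j)

def cell_center(c):
    i, j = c
    return (0.5 * (x_edges[i] + x_edges[i + 1]),
            0.5 * (v_edges[j] + v_edges[j + 1]))

# Finite sets of admissible control levels and time intervals (assumed values).
U = [-1.0, 0.0, 1.0]
TAU = [0.1, 0.2]

def step(x, v, u, tau):
    """Motion over one control interval for x' = v, v' = u (Euler, assumed model)."""
    return x + v * tau, v + u * tau

def interval_cost(x, v, u, tau):
    """Discretized cost accumulated over one interval (assumed quadratic form)."""
    return tau * (x * x + v * v + 0.1 * u * u)

# Family of cell-to-cell mappings: for each cell and each (u, tau), record the
# image cell and the incremental cost of that interval control.
target = cell_of(0.0, 0.0)
image = {}
for i, j in itertools.product(range(Nx), range(Nv)):
    x, v = cell_center((i, j))
    for u in U:
        for tau in TAU:
            nxt = cell_of(*step(x, v, u, tau))
            image[((i, j), u, tau)] = (nxt, interval_cost(x, v, u, tau))

# Systematic search (Dijkstra-like, backward from the target cell): the result
# is a discrete optimal control table giving the first interval control per cell.
best = {target: 0.0}
table = {}
heap = [(0.0, target)]
settled = set()
while heap:
    cost_c, c = heapq.heappop(heap)
    if c in settled:
        continue
    settled.add(c)
    # Relax every cell whose image under some (u, tau) is the settled cell c.
    for (src, u, tau), (dst, dc) in image.items():
        if dst == c and cost_c + dc < best.get(src, np.inf):
            best[src] = cost_c + dc
            table[src] = (u, tau)
            heapq.heappush(heap, (best[src], src))

print("cells with a defined optimal control:", len(table))
```

Once built, the table can be consulted online: the current state is located in its cell, and the stored (u, τ) entry is applied for one interval, which is the property the abstract points to as the basis for real-time control.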