On an improved model reduction technique for nonlinear systems

An algorithm is presented for reducing the number of terms in a nonlinear static system that can be modeled as a linear combination of nonlinear functions. The method improves on a previously presented algorithm (Desrochers and Saridis, 1980a). The improvements make it possible to perform all calculations from a single set of input and output data, whereas the original algorithm required n sets of data, where n is the number of terms retained. In addition, it is shown how the model error can be calculated at each iteration, which removes the arbitrariness of stopping the algorithm at a preselected value of n, as was done originally. The insight gained from this improved technique is then used to develop an optimal solution to the model reduction problem, a major improvement over the original method. Finally, it is conjectured that some structural concepts for such systems may be embedded in a matrix formed from the input and output data.
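The flavor of the problem described above can be illustrated with a generic sketch: given a single set of input and output data, greedily select terms from a candidate set of nonlinear functions by least squares, recording the model error after each iteration so the user can stop when the error is acceptable rather than at a preselected n. This is an illustrative forward-selection sketch of the general idea only, not the authors' algorithm; the function names and basis choices are hypothetical.

```python
import numpy as np

def greedy_term_selection(Phi, y, n_max):
    """Illustrative greedy term selection from one input-output data set.

    Phi   : (N, m) matrix whose columns are the m candidate nonlinear
            functions evaluated at the N input samples.
    y     : (N,) vector of measured outputs.
    n_max : maximum number of terms to retain.

    Returns the selected column indices and the residual model error
    after each iteration, so the stopping point can be chosen from the
    observed error rather than fixed in advance.
    """
    N, m = Phi.shape
    selected, errors = [], []
    for _ in range(n_max):
        best_err, best_j = np.inf, None
        for j in range(m):
            if j in selected:
                continue
            cols = Phi[:, selected + [j]]
            # Least-squares fit with the candidate term included.
            theta, *_ = np.linalg.lstsq(cols, y, rcond=None)
            err = np.linalg.norm(y - cols @ theta)
            if err < best_err:
                best_err, best_j = err, j
        selected.append(best_j)
        errors.append(best_err)
    return selected, errors

# Example: four candidate terms, output generated by two of them.
x = np.linspace(0.1, 1.0, 40)
Phi = np.column_stack([np.ones_like(x), x, x**2, np.sin(x)])
y = 3.0 * x - 1.5 * x**2
sel, errs = greedy_term_selection(Phi, y, n_max=4)
# The recorded errors are non-increasing, and with all terms
# retained the residual vanishes (the true terms are candidates).
```

Because each iteration re-solves a least-squares problem on the same single data set, the per-iteration error comes for free, which is the convenience the abstract highlights over requiring n separate data sets.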