Model reduction by minimization of integral square error performance indices

Abstract

A model reduction method based on the minimization of output response deviations is presented. In the proposed method, both the poles and zeros of the reduced model are treated as free parameters and are determined by minimizing the integral square error between the impulse (or step) responses of the original and reduced models. When the models are excited by a random input, the reduced models obtained from impulse response deviations outperform those designed to minimize step response deviations. Comparison with existing methods shows that the proposed procedure yields better reduced models.
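To make the idea concrete, the sketch below illustrates the general approach described in the abstract, not the paper's exact algorithm or numerical example: a hypothetical fourth-order plant is approximated by a second-order model whose numerator and denominator coefficients (hence its zero and poles) are all free parameters, and a derivative-free search minimizes the integral square error between the two impulse responses evaluated numerically on a time grid. The plant, the parameterization, the time grid, and the optimizer choice are all assumptions made for illustration.

```python
# Minimal sketch (illustrative only): fit a second-order model to a
# hypothetical fourth-order plant by minimizing the integral square
# error (ISE) between their impulse responses.
import numpy as np
from scipy import signal
from scipy.optimize import minimize

# Hypothetical full-order plant G(s); any stable high-order model could be used.
G = signal.TransferFunction([1, 7, 24, 24], [1, 10, 35, 50, 24])

t = np.linspace(0.0, 10.0, 2000)       # evaluation grid for the responses
_, g_imp = signal.impulse(G, T=t)      # impulse response of the full model

def ise(x):
    """ISE between full- and reduced-model impulse responses.

    x = [b1, b0, a1, a0] parameterizes R(s) = (b1*s + b0) / (s^2 + a1*s + a0),
    so the zero and both poles of the reduced model are free parameters.
    """
    b1, b0, a1, a0 = x
    # Penalize unstable candidates so the search stays in the stable region.
    if np.any(np.real(np.roots([1.0, a1, a0])) >= 0.0):
        return 1e6
    R = signal.TransferFunction([b1, b0], [1.0, a1, a0])
    _, r_imp = signal.impulse(R, T=t)
    # Numerical approximation of the integral square error.
    return np.trapz((g_imp - r_imp) ** 2, t)

# Crude initial guess matching the plant's DC gain G(0) = 1; the
# derivative-free search then adjusts all four coefficients.
x0 = np.array([1.0, 1.0, 2.0, 1.0])
res = minimize(ise, x0, method="Nelder-Mead",
               options={"maxiter": 2000, "xatol": 1e-8})

b1, b0, a1, a0 = res.x
print(f"R(s) = ({b1:.4f} s + {b0:.4f}) / (s^2 + {a1:.4f} s + {a0:.4f})")
print(f"ISE at optimum: {res.fun:.6f}")
```

A step-response variant would replace `signal.impulse` with `signal.step`; note that the step-response ISE is finite only when the reduced model reproduces the full model's steady-state (DC) gain, so that constraint would normally be imposed on the free parameters in that case.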