Two-pass orthogonal least-squares algorithm to train and reduce fuzzy logic systems

Fuzzy logic systems (FLSs) can be designed from training data (i.e., M numerical input/output pairs) using supervised learning algorithms. Orthogonal least-squares (OLS) learning decomposes an FLS into a linear combination of M_s < M nonlinear fuzzy basis functions (FBFs), which are optimized during OLS to match the training data. The drawback of OLS is that the resulting system still contains information from all M initial rules derived from the training points, even though OLS has established only the most important M_s rules. This is caused by the normalization of the FBFs and leads to excessive computation times during further processing. Our solution is to construct new FBFs from the reduced rule base and to run OLS a second time. The resulting system not only has reduced computational complexity but also behaves very similarly to the unreduced system. The second OLS pass can be applied to a larger set of training data, which greatly improves the precision. We illustrate our two-pass OLS algorithm on prediction of the Mackey-Glass chaotic time series.
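As a concrete illustration (not taken from the paper itself), the following Python sketch implements classical greedy OLS subset selection over fuzzy basis functions and the two-pass idea described above. The Gaussian membership form, the shared spread sigma, all function and variable names, and the use of a plain least-squares solve for the second-pass weights are assumptions made for illustration only.

```python
import numpy as np

def fbf_matrix(X, centers, sigma):
    """Fuzzy basis functions: Gaussian rule firings normalized over the
    given rule base (each row sums to 1). Assumed membership form."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    mu = np.exp(-d2 / (2.0 * sigma ** 2))
    return mu / (mu.sum(axis=1, keepdims=True) + 1e-12)

def ols_select(P, d, n_select):
    """Greedy OLS: keep the n_select columns of P with the largest
    error-reduction ratio, Gram-Schmidt-orthogonalizing as we go."""
    N, M = P.shape
    selected, W = [], []
    for _ in range(n_select):
        best_j, best_err, best_w = -1, -1.0, None
        for j in range(M):
            if j in selected:
                continue
            w = P[:, j].astype(float).copy()
            for wk in W:                       # orthogonalize against kept columns
                w -= (wk @ P[:, j]) / (wk @ wk) * wk
            denom = w @ w
            if denom < 1e-12:                  # (nearly) dependent column; skip
                continue
            g = (w @ d) / denom
            err = g * g * denom / (d @ d)      # error-reduction ratio
            if err > best_err:
                best_j, best_err, best_w = j, err, w
        if best_j < 0:
            break
        selected.append(best_j)
        W.append(best_w)
    return selected

def two_pass_ols(X_small, d_small, X_large, d_large, M_s, sigma=0.5):
    """Pass 1: one candidate rule per training input; OLS keeps M_s of them.
    Pass 2: renormalize the FBFs over only the kept rules and refit on a
    larger data set (a plain least-squares solve, since all M_s FBFs stay)."""
    P1 = fbf_matrix(X_small, X_small, sigma)   # M = len(X_small) candidate rules
    kept = ols_select(P1, d_small, M_s)        # indices of surviving rules
    centers = X_small[kept]
    P2 = fbf_matrix(X_large, centers, sigma)   # FBFs renormalized over M_s rules
    theta, *_ = np.linalg.lstsq(P2, d_large, rcond=None)
    return centers, theta
```

The effect of the second pass is visible in fbf_matrix: the reduced system's FBFs are renormalized over only the M_s surviving rule centres, so evaluating it no longer requires firing all M original rules.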