Two-pass orthogonal least-squares algorithm to train and reduce fuzzy logic systems
Fuzzy logic systems (FLSs) can be designed from training data (i.e. M numerical input/output pairs) using supervised learning algorithms. Orthogonal least-squares (OLS) learning decomposes an FLS into a linear combination of M_s < M nonlinear fuzzy basis functions (FBFs), which are optimized during OLS to match the training data. The drawback of OLS is that the resulting system still contains information from all M initial rules derived from the training points, even though OLS has established only the most important M_s rules. This is due to the normalization of the FBFs and leads to excessive computation times during further processing. Our solution is to construct new FBFs from the reduced rule base and to run OLS a second time. The resulting system not only has reduced computational complexity but also behaves very similarly to the unreduced system. The second OLS run can be applied to a larger set of training data, which greatly improves precision. We illustrate our two-pass OLS algorithm on prediction of the Mackey-Glass chaotic time series.
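The following Python sketch illustrates the mechanism described in the abstract. It assumes Gaussian membership functions of a common width sigma and one candidate rule per training point; all function names (fbf_matrix, ols_select, two_pass_ols) and parameter choices are our own illustrative assumptions, and the second pass is simplified to a plain least-squares refit rather than the full second OLS run on a larger training set described in the paper.

```python
import numpy as np

def gaussian_memberships(X, centers, sigma):
    """Rule firing strengths: product of 1-D Gaussians for each rule."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fbf_matrix(X, centers, sigma):
    """Fuzzy basis functions: normalized firing strengths.
    The row-wise normalization is what couples every FBF to ALL rules."""
    mu = gaussian_memberships(X, centers, sigma)
    return mu / mu.sum(axis=1, keepdims=True)

def ols_select(P, d, n_select):
    """Greedy OLS forward selection via classical Gram-Schmidt:
    repeatedly pick the candidate column with the largest
    error-reduction ratio, then orthogonalize the rest against it."""
    P = P.astype(float).copy()
    mask = np.ones(P.shape[1], dtype=bool)
    selected = []
    dd = float(d @ d)
    for _ in range(n_select):
        den = (P ** 2).sum(axis=0)           # ||p_i||^2
        num = P.T @ d                        # p_i^T d
        err = np.where(mask & (den > 1e-12), num ** 2 / (den * dd), -np.inf)
        j = int(np.argmax(err))
        selected.append(j)
        mask[j] = False
        w = P[:, j] / np.sqrt(den[j])        # unit vector of chosen column
        P[:, mask] -= np.outer(w, w @ P[:, mask])  # deflate remaining columns
    return selected

def two_pass_ols(X, y, sigma, n_rules):
    # Pass 1: one candidate rule per training point; keep the n_rules most
    # important ones.  Because of the normalization in fbf_matrix, these
    # FBFs still depend on all M training points.
    keep = ols_select(fbf_matrix(X, X, sigma), y, n_rules)
    centers = X[keep]
    # Pass 2: rebuild the FBFs from the reduced rule base only and refit.
    # (The paper reruns full OLS here, optionally on a larger training set;
    # a plain least-squares refit keeps this sketch short.)
    P2 = fbf_matrix(X, centers, sigma)
    theta, *_ = np.linalg.lstsq(P2, y, rcond=None)
    return centers, theta

def predict(Xq, centers, sigma, theta):
    return fbf_matrix(Xq, centers, sigma) @ theta

if __name__ == "__main__":
    # Toy 1-D regression stand-in for the Mackey-Glass experiment.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=200)
    centers, theta = two_pass_ols(X, y, sigma=0.7, n_rules=15)
    rmse = np.sqrt(np.mean((predict(X, centers, 0.7, theta) - y) ** 2))
    print("rules kept:", len(centers), "train RMSE:", rmse)
```

The computational benefit shows up in predict: the pass-2 system evaluates only n_rules memberships per query instead of all M, because the normalization now runs over the reduced rule base alone.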