Design of Fuzzy Ensemble Architecture Realized With the Aid of FCM-Based Fuzzy Partition and NN With Weighted LSE Estimation

Neural Networks (NNs) with Least Square Error (LSE) estimation form a certain type of single hidden layer feed-forward neural network. In this class of networks, the input connections (weights) and the biases of the hidden neurons are generated randomly and kept fixed afterwards; the output connections are estimated by the LSE method rather than by back-propagation. The random generation of the input weights and hidden biases requires a larger number of hidden neurons to assure the quality of classification performance. To reduce the number of hidden neurons while maintaining classification performance, we apply a "divide and conquer" strategy. In other words, we divide the overall input space into several sub-spaces by means of an information granulation technique (the Fuzzy C-Means clustering algorithm) and determine local decision boundaries within the related sub-spaces. A decision boundary defined over the entire input space can then be regarded as a composition of several decision boundaries defined over the sub-spaces that form it, and the boundaries defined in the sub-spaces exhibit lower nonlinearity than the one encountered when considering the entire input space. By using weighted least square error estimation instead of the plain least square error method, the output connections of the several local neural networks can be estimated without interfering with each other, as sketched below.
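To make the construction concrete, the following minimal NumPy sketch (not the authors' implementation; names such as FuzzyRandomNNEnsemble, fcm, and all hyper-parameter defaults are hypothetical) partitions the input space with FCM, builds one small network with randomly generated, fixed hidden weights per sub-space, and estimates each local network's output weights by membership-weighted LSE, beta_k = (H^T D_k H)^{-1} H^T D_k Y with D_k = diag(u_{ik}^m). Predictions are aggregated through the membership degrees of a query point in the FCM prototypes.

```python
import numpy as np

def memberships(X, V, m=2.0):
    """FCM membership degrees of each row of X in each prototype of V."""
    D = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2) + 1e-12
    inv = D ** (-2.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)

def fcm(X, c, m=2.0, iters=100, tol=1e-5, seed=0):
    """Fuzzy C-Means: returns prototypes V (c x d) and partition matrix U (n x c)."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)                  # rows sum to one
    for _ in range(iters):
        Um = U ** m
        V = (Um.T @ X) / Um.sum(axis=0)[:, None]       # cluster prototypes
        U_new = memberships(X, V, m)
        if np.abs(U_new - U).max() < tol:
            return V, U_new
        U = U_new
    return V, U

class FuzzyRandomNNEnsemble:
    """Hypothetical sketch: one randomized-hidden-layer network per FCM
    sub-space; output weights fitted by membership-weighted LSE."""

    def __init__(self, n_hidden=40, n_clusters=3, m=2.0, ridge=1e-6, seed=0):
        self.n_hidden, self.c, self.m = n_hidden, n_clusters, m
        self.ridge, self.seed = ridge, seed

    def fit(self, X, Y):
        rng = np.random.default_rng(self.seed)
        self.V, U = fcm(X, self.c, self.m, seed=self.seed)
        self.nets = []
        for k in range(self.c):
            # random, fixed input weights and hidden biases (never trained)
            W = rng.normal(size=(X.shape[1], self.n_hidden))
            b = rng.normal(size=self.n_hidden)
            H = np.tanh(X @ W + b)                     # hidden-layer outputs
            w = U[:, k] ** self.m                      # fuzzy sample weights
            Hw = H * w[:, None]
            # weighted LSE with a small ridge term for numerical stability:
            # beta_k = (H^T D_k H + ridge*I)^{-1} H^T D_k Y, D_k = diag(w)
            beta = np.linalg.solve(H.T @ Hw + self.ridge * np.eye(self.n_hidden),
                                   Hw.T @ Y)
            self.nets.append((W, b, beta))
        return self

    def predict(self, X):
        U = memberships(X, self.V, self.m)
        out = sum(U[:, k:k+1] * (np.tanh(X @ W + b) @ beta)
                  for k, (W, b, beta) in enumerate(self.nets))
        return out.argmax(axis=1)                      # class labels

# Toy usage: two Gaussian blobs with one-hot targets.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (100, 2))])
y = np.repeat([0, 1], 100)
Y = np.eye(2)[y]                                       # one-hot encoding
model = FuzzyRandomNNEnsemble(n_hidden=20, n_clusters=2).fit(X, Y)
print("training accuracy:", (model.predict(X) == y).mean())
```

Because each sample's squared residual is weighted by its membership in sub-space k, samples that belong mostly to other sub-spaces contribute almost nothing to beta_k; this is what allows the local estimations to proceed without interfering with one another while each local network stays small.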
