Learning Subspace-Based RBFNN Using Coevolutionary Algorithm for Complex Classification Tasks

Many real-world classification problems involve samples with complex distributions in the input space, and classification accuracy is largely determined by the intrinsic properties of samples within feature subspaces. This paper proposes a novel algorithm for constructing a radial basis function neural network (RBFNN) classifier based on subspace learning. A feature subspace is learned for every hidden node of the RBFNN during training: the connection weights between the input layer and the hidden layer are adjusted so that different hidden nodes operate in different subspaces spanned by their dominative features. The network structure and the dominative features are encoded in two subpopulations that are cooperatively coevolved, yielding better global optimality for the estimated RBFNN. Experimental results demonstrate that the proposed algorithm obtains RBFNN models with both higher classification accuracy and simpler network structure than competing learning algorithms. The proposed model thus offers a more flexible and efficient approach to complex classification tasks by exploiting the local characteristics of samples in subspaces.
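
The subspace mechanism can be pictured with a small sketch: each hidden node carries its own per-feature weights, so its Gaussian activation is computed in a node-specific subspace rather than over all input dimensions. The Python below is a minimal illustration under that reading, not the authors' implementation; the function names (`subspace_rbf_design`, `fit_output_weights`), the ridge-regularised output solve, and the toy data are all assumptions introduced for this example.

```python
# Illustrative sketch (hypothetical, not the paper's code): each hidden node k
# holds a centre c_k, a width sigma_k, and a feature-weight vector in [0, 1]^d
# acting as a soft mask that selects that node's "dominative" feature subspace.
import numpy as np

def subspace_rbf_design(X, centres, sigmas, feature_weights):
    """Hidden-layer design matrix with per-node feature subspaces.

    X               : (n, d) input samples
    centres         : (m, d) one centre per hidden node
    sigmas          : (m,)   one width per hidden node
    feature_weights : (m, d) per-node soft feature mask in [0, 1]
    returns         : (n, m) hidden-node activations
    """
    n, _ = X.shape
    m = centres.shape[0]
    H = np.empty((n, m))
    for k in range(m):
        # Weighted squared distance: features with weight near 0 are ignored,
        # so node k effectively responds only within its own feature subspace.
        diff = (X - centres[k]) * feature_weights[k]
        H[:, k] = np.exp(-np.sum(diff ** 2, axis=1) / (2.0 * sigmas[k] ** 2))
    return H

def fit_output_weights(H, Y, ridge=1e-6):
    """Regularised least-squares solve for the hidden-to-output weights."""
    m = H.shape[1]
    return np.linalg.solve(H.T @ H + ridge * np.eye(m), H.T @ Y)

# Toy usage: two classes where only the first two of five features matter.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
Y = np.column_stack([y, 1 - y])                          # one-hot targets

centres = X[rng.choice(len(X), size=6, replace=False)]
sigmas = np.full(6, 1.0)
feature_weights = np.tile([1, 1, 0.1, 0.1, 0.1], (6, 1))  # assumed dominative subspace

H = subspace_rbf_design(X, centres, sigmas, feature_weights)
W = fit_output_weights(H, Y)
acc = np.mean(np.argmax(H @ W, axis=1) == np.argmax(Y, axis=1))
print(f"training accuracy: {acc:.2f}")
```

In the paper's scheme the centres/structure and the per-node feature masks would not be fixed by hand as above; they are encoded in two cooperating subpopulations and coevolved, with a candidate from one subpopulation evaluated by pairing it with representatives of the other.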
