Algorithm learning based neural network integrating feature selection and classification

Feature selection and classification techniques have traditionally been studied independently, without considering the interaction between the two procedures, which leads to degraded performance. In this paper, we present a new neural network approach, called an algorithm learning based neural network (ALBNN), that improves classification accuracy by integrating the feature selection and classification procedures. In general, a knowledge-based artificial neural network operates on prior knowledge drawn from domain experience, which provides better starting points for the target function and leads to better classification accuracy. However, such prior knowledge is usually difficult to identify. Instead of relying on unknown background resources, the proposed method uses prior knowledge that is mathematically calculated from the properties of other learning algorithms such as PCA, LARS, C4.5, and SVM. We employ the extreme learning machine (ELM) in this study to obtain better initial points faster and to avoid time-consuming work such as architecture determination and manual tuning. ALBNN approximates a target hypothesis by both accounting for the interaction between the two procedures and minimizing the error of each individual procedure. The approach produces new relevant features and improves classification accuracy. Experimental results show improved performance on a variety of classification problems. ALBNN can be applied in various fields that require high classification accuracy.
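
As a rough illustration of the idea described above, the following is a minimal sketch, not the authors' exact ALBNN procedure, of how prior knowledge computed from another learning algorithm (here, PCA loadings) might seed the hidden layer of a single-hidden-layer network whose output weights are then solved in closed form, ELM-style. The dataset, the number of hidden units, and the use of scikit-learn are assumptions made purely for illustration.

```python
# Illustrative sketch only: seeding part of an ELM-style hidden layer with
# "prior knowledge" derived from PCA, then solving output weights by least
# squares. This is NOT the paper's full ALBNN, which also draws on LARS,
# C4.5, and SVM and integrates feature selection with classification.
import numpy as np
from sklearn.datasets import load_iris          # toy data (assumption)
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

X, y = load_iris(return_X_y=True)
X = StandardScaler().fit_transform(X)
Y = np.eye(y.max() + 1)[y]                      # one-hot targets

n_hidden = 20                                   # chosen for illustration

# "Prior knowledge": PCA loadings give data-driven directions for some of
# the input-to-hidden weights; the remaining weights stay random, as in a
# standard extreme learning machine.
pca = PCA(n_components=min(X.shape[1], n_hidden)).fit(X)
W_prior = pca.components_                       # (k, n_features)
W_rand = rng.standard_normal((n_hidden - W_prior.shape[0], X.shape[1]))
W = np.vstack([W_prior, W_rand])                # (n_hidden, n_features)
b = rng.standard_normal(n_hidden)

# ELM step: fixed hidden layer, output weights via linear least squares.
H = np.tanh(X @ W.T + b)                        # hidden activations
beta, *_ = np.linalg.lstsq(H, Y, rcond=None)

pred = (H @ beta).argmax(axis=1)
print("training accuracy:", (pred == y).mean())
```

The sketch only demonstrates the two ingredients named in the abstract, an algorithm-derived starting point and the fast closed-form ELM solve; how the paper combines the feature selection and classification errors inside one network is described in the full text.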
