An Optimized Classification Algorithm by Neural Network Ensemble Based on PLS and OLS

When a neural network is used to classify data with high dimensionality and few samples, the large number of feature inputs complicates the design of the network structure, and the scarcity of samples leads to incomplete training or overfitting; both effects noticeably limit recognition accuracy. To still exploit neural network classifiers in this setting, this paper proposes an optimized classification algorithm based on a neural network ensemble combined with partial least squares (PLS) and ordinary least squares (OLS). The algorithm first uses PLS to reduce the feature dimension of the small-sample data, yielding a low-dimensional representation with stronger explanatory power, and then uses OLS to determine the weight of each neural network in the ensemble learning system. The dimension reduction simplifies the network structure and improves computational efficiency, while ensemble learning compensates for the information lost in the reduction and improves the recognition accuracy of the classification system. Finally, a case study shows that the new algorithm substantially improves both operating efficiency and recognition accuracy, which makes it worth further application.
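The sketch below (Python with NumPy and scikit-learn, which the paper does not prescribe) illustrates the three stages described above: PLS dimension reduction, training several neural networks on the reduced features, and OLS-determined combination weights. The synthetic dataset, the number of PLS components and networks, the network sizes, and the stacking-style formulation of the OLS weighting are illustrative assumptions, not values taken from the paper.

    # Minimal sketch of the PLS + OLS neural network ensemble pipeline.
    # All hyperparameters below are illustrative, not from the paper.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Small-sample, high-dimensional data (placeholder for the paper's case study).
    X, y = make_classification(n_samples=120, n_features=300, n_informative=20,
                               n_classes=3, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)

    # Step 1: PLS dimension reduction. Fit PLS against one-hot labels
    # (PLS-DA style) and keep only the latent scores as the reduced features.
    Y_onehot = np.eye(3)[y_train]
    pls = PLSRegression(n_components=5).fit(X_train, Y_onehot)
    Z_train, Z_test = pls.transform(X_train), pls.transform(X_test)

    # Step 2: train several neural networks on the reduced features,
    # differing only in their random initialization.
    nets = [MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                          random_state=seed).fit(Z_train, y_train)
            for seed in (1, 2, 3)]

    # Step 3: determine ensemble weights by ordinary least squares, regressing
    # the one-hot targets on the stacked member outputs.
    P_train = np.hstack([net.predict_proba(Z_train) for net in nets])  # (n, 3*k)
    W, *_ = np.linalg.lstsq(P_train, Y_onehot, rcond=None)             # (3*k, 3)

    # Weighted combination of member outputs on the test set.
    P_test = np.hstack([net.predict_proba(Z_test) for net in nets])
    y_pred = np.argmax(P_test @ W, axis=1)
    print("ensemble accuracy:", np.mean(y_pred == y_test))

In this formulation the OLS step acts as the ensemble combiner: the member outputs are regressed onto the one-hot targets, so members whose outputs best explain the labels receive larger weights in the final decision.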
