An ensemble learning framework for convolutional neural network based on multiple classifiers

Traditional machine learning methods have limitations in building high-precision estimation models and in generalization, whereas ensemble learning, which combines multiple single models into one, generally outperforms any individual model. However, as data sets grow in size and diversity, ensemble learning algorithms can suffer from incomplete feature representations. Convolutional neural networks (CNNs), with their strong feature-learning ability, compensate for this shortcoming. This paper proposes an ensemble learning framework for convolutional neural networks based on multiple classifiers. First, UCI data sets are classified with ensemble learning algorithms built from multiple base classifiers. Then, a CNN extracts features from the MNIST image data set, and the extracted features are fed to the ensemble learning framework for classification. Experimental results show that the ensemble achieves higher accuracy than any single classifier, and that the CNN + ensemble learning framework achieves higher accuracy than the ensemble learning framework alone.
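A minimal sketch of the second stage described above, assuming a small Keras CNN as the feature extractor and a scikit-learn voting ensemble of heterogeneous classifiers; the specific architecture, base classifiers, and hyperparameters are illustrative assumptions, not the authors' exact configuration.

```python
import tensorflow as tf
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Load MNIST and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0
x_test = x_test[..., None].astype("float32") / 255.0

# Small CNN; the dense layer named "features" supplies the learned representation.
inputs = tf.keras.Input(shape=(28, 28, 1))
x = tf.keras.layers.Conv2D(32, 3, activation="relu")(inputs)
x = tf.keras.layers.MaxPooling2D()(x)
x = tf.keras.layers.Conv2D(64, 3, activation="relu")(x)
x = tf.keras.layers.MaxPooling2D()(x)
x = tf.keras.layers.Flatten()(x)
features = tf.keras.layers.Dense(128, activation="relu", name="features")(x)
outputs = tf.keras.layers.Dense(10, activation="softmax")(features)
cnn = tf.keras.Model(inputs, outputs)
cnn.compile(optimizer="adam",
            loss="sparse_categorical_crossentropy",
            metrics=["accuracy"])
cnn.fit(x_train, y_train, epochs=3, batch_size=128, verbose=0)

# Feature extractor: outputs of the penultimate dense layer.
extractor = tf.keras.Model(inputs, features)
f_train = extractor.predict(x_train, verbose=0)
f_test = extractor.predict(x_test, verbose=0)

# Ensemble of multiple classifiers over the CNN features (majority voting).
ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100)),
        ("lr", LogisticRegression(max_iter=1000)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
    ],
    voting="hard",
)
ensemble.fit(f_train, y_train)
print("CNN + ensemble accuracy:",
      accuracy_score(y_test, ensemble.predict(f_test)))
```

The same voting-ensemble step can be applied directly to the UCI data sets in the first stage by fitting the ensemble on the raw tabular features instead of CNN-extracted ones.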
