Boosting of Neural Networks over MNIST Data

The methods proposed in this article are based on boosting, a technique that combines a large number of so-called weak classifiers into a single strong classifier. The article focuses on increasing the efficiency of the algorithms through their appropriate combination, in particular on increasing their reliability and reducing their time demands. Time demands here refer not to the running time of the algorithm itself, nor to the time needed to develop it, but to the time needed to apply the algorithm to a particular problem domain. Simulations and experiments with the proposed procedures were carried out in an application environment designed and implemented for this purpose. The experiments were conducted on the MNIST database of handwritten digits, which is commonly used for training and testing in the field of machine learning. Finally, a comparative experimental study with other approaches is presented, and all achieved results are summarized in the conclusion.

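The abstract describes boosting only at the level of principle. As a concrete illustration of combining weak neural-network classifiers into a strong one, the sketch below uses a SAMME-style multiclass AdaBoost with boosting-by-resampling. The weak-learner architecture, the number of boosting rounds, and the use of scikit-learn's small 8x8 `load_digits` set as a lightweight stand-in for MNIST are all assumptions made for illustration, not the authors' exact procedure.

```python
# Minimal sketch: boosting deliberately small neural networks into a strong
# classifier (SAMME-style multiclass AdaBoost with boosting-by-resampling).
# The weak-learner size, round count, and the load_digits stand-in for MNIST
# are illustrative assumptions, not the procedure from the article.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X, y = load_digits(return_X_y=True)                 # 8x8 digit images, 10 classes
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

n, K, T = len(X_train), 10, 15                      # samples, classes, boosting rounds
w = np.full(n, 1.0 / n)                             # uniform sample weights
learners, alphas = [], []

for t in range(T):
    # MLPClassifier does not accept sample weights, so each round trains on a
    # resample drawn according to the current weights (boosting by resampling).
    idx = rng.choice(n, size=n, p=w)
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=100, random_state=t)
    clf.fit(X_train[idx], y_train[idx])             # small net kept deliberately weak

    pred = clf.predict(X_train)
    miss = (pred != y_train).astype(float)
    err = np.clip(np.dot(w, miss), 1e-10, 1 - 1e-10)   # weighted training error
    if err >= 1.0 - 1.0 / K:                           # no better than chance: discard
        continue
    alpha = np.log((1.0 - err) / err) + np.log(K - 1.0)  # SAMME learner weight

    w *= np.exp(alpha * miss)                          # up-weight misclassified samples
    w /= w.sum()
    learners.append(clf)
    alphas.append(alpha)

# Strong classifier: weighted vote of all weak networks.
votes = np.zeros((len(X_test), K))
for alpha, clf in zip(alphas, learners):
    votes[np.arange(len(X_test)), clf.predict(X_test)] += alpha
print("ensemble accuracy:", np.mean(votes.argmax(axis=1) == y_test))
```

Resampling is used here only because the chosen weak learner cannot consume sample weights directly; a weak learner with native weight support could be trained on the weighted data instead, which is the more common formulation of AdaBoost.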