Classification of students using an incremental ensemble of classifiers

The amount of student data in education-system databases is growing day by day, so the knowledge extracted from these data needs to be updated continuously. The training set for the supervised learning algorithms contains students' scores on a test. Incremental learning ability is increasingly important for machine learning methodologies because student data and information keep growing. In contrast to a classical batch learning algorithm, an incremental learning algorithm tries to discard irrelevant information while training on new instances. Nowadays, combining classifiers is an increasingly popular approach to improving overall classification results. Therefore, an incremental ensemble of two classifiers, Naïve Bayes and K-Star (K*), combined through a majority voting scheme is proposed. A large-scale comparison of the proposed ensemble technique, under different voting schemes, against state-of-the-art algorithms on the student data set has been carried out. The experimental results show high accuracy for the proposed ensemble in classifying students, and the majority voting scheme achieved higher accuracy than the other voting schemes.
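To make the incremental majority-voting idea concrete, below is a minimal sketch in Python, assuming scikit-learn and SciPy are available. K* has no scikit-learn implementation (it is a Weka classifier), so KNeighborsClassifier stands in here as an instance-based analogue; the ensemble class, the four "test-score" features, and the pass/fail labels are all hypothetical illustrations, not the paper's actual setup.

```python
import numpy as np
from scipy import stats
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier


class IncrementalVotingEnsemble:
    """Majority (hard) voting over an incrementally updated Naive Bayes
    model and an instance-based learner that accumulates seen instances."""

    def __init__(self, classes, n_neighbors=3):
        self.classes = np.asarray(classes)
        self.nb = GaussianNB()
        self.knn = KNeighborsClassifier(n_neighbors=n_neighbors)
        self._X, self._y = [], []  # instance store for the lazy learner

    def partial_fit(self, X, y):
        # Naive Bayes updates its per-class sufficient statistics in place.
        self.nb.partial_fit(X, y, classes=self.classes)
        # Instance-based learners are incremental by construction: new
        # instances are simply added to the store, then the model is refit.
        self._X.extend(np.asarray(X))
        self._y.extend(np.asarray(y))
        self.knn.fit(np.asarray(self._X), np.asarray(self._y))
        return self

    def predict(self, X):
        # Stack each member's predictions and take the per-sample majority
        # vote (ties resolve to the smallest class label).
        votes = np.stack([self.nb.predict(X), self.knn.predict(X)])
        return stats.mode(votes, axis=0, keepdims=False).mode


# Toy usage on synthetic "student score" batches (all values hypothetical).
rng = np.random.default_rng(0)
ensemble = IncrementalVotingEnsemble(classes=[0, 1])
for _ in range(5):  # five data batches arriving over time
    X = rng.normal(size=(20, 4))  # e.g. four test-score features
    y = (X[:, 0] + X[:, 1] > 0).astype(int)  # pass/fail label
    ensemble.partial_fit(X, y)
print(ensemble.predict(rng.normal(size=(3, 4))))
```

Note that with only two members a strict majority exists only when both agree; probability-averaging or weighted voting, as in the other voting schemes the text compares, are the usual ways to break such ties.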
