NEVE++: A neuro-evolutionary unlimited ensemble for adaptive learning

In our previous works [1, 2], we proposed NEVE, a model that uses a weighted ensemble of neural network classifiers for adaptive learning, trained by means of a quantum-inspired evolutionary algorithm (QIEA). We showed that the neuro-evolutionary classifiers were able to learn the dataset and to respond quickly to drifts in the underlying data. Now, we are particularly interested in analyzing the influence of an unlimited ensemble, as opposed to the limited ensemble used in NEVE. To that end, we modified NEVE to work with unlimited ensembles and call this new algorithm NEVE++. To verify how the unlimited ensemble influences the results, we compared the accuracy of NEVE and NEVE++ on four different datasets with concept drift, using two other existing algorithms as reference.
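
To make the distinction concrete, the sketch below shows a generic block-based, weighted-voting ensemble in Python, in which max_size=None plays the role of the unlimited pool of NEVE++ and a finite max_size plays the role of the limited ensemble of NEVE. The names (StreamingEnsemble, Member, train_member) and the accuracy-based reweighting rule are illustrative assumptions only; the actual NEVE/NEVE++ weighting scheme and the QIEA training of each member are those described in [1, 2] and are not reproduced here.

# Minimal sketch of a limited vs. unlimited weighted-voting ensemble over a data stream.
# All names and update rules are hypothetical placeholders, not the authors' formulation.
from dataclasses import dataclass, field
from typing import Callable, List, Optional, Sequence

@dataclass
class Member:
    model: Callable[[Sequence[float]], int]  # a trained classifier: x -> predicted label
    weight: float = 1.0

@dataclass
class StreamingEnsemble:
    max_size: Optional[int] = None           # None -> unlimited pool (NEVE++-style)
    members: List[Member] = field(default_factory=list)

    def predict(self, x: Sequence[float]) -> int:
        """Weighted majority vote over all current members."""
        votes = {}
        for m in self.members:
            label = m.model(x)
            votes[label] = votes.get(label, 0.0) + m.weight
        return max(votes, key=votes.get)

    def update(self, block_x, block_y, train_member) -> None:
        """Process one data block: reweight existing members by their accuracy on
        the block, train a new member on it, and prune the weakest member only
        when a size limit is in force."""
        for m in self.members:
            hits = sum(m.model(x) == y for x, y in zip(block_x, block_y))
            m.weight = hits / len(block_y)   # placeholder reweighting rule
        # train_member stands in for the QIEA-based training of a new classifier
        self.members.append(Member(model=train_member(block_x, block_y)))
        if self.max_size is not None and len(self.members) > self.max_size:
            self.members.remove(min(self.members, key=lambda m: m.weight))

Under this reading, NEVE corresponds to StreamingEnsemble(max_size=k) for a fixed k, whereas NEVE++ corresponds to StreamingEnsemble(max_size=None), so no member is ever discarded and the vote is taken over every classifier trained so far.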

[1] Thomas D. Sandry et al., Introductory Statistics With R, 2003, Technometrics.

[2] William Nick Street et al., A streaming ensemble algorithm (SEA) for large-scale classification, 2001, KDD '01.

[3] R Core Team, R: A language and environment for statistical computing, 2014.

[4] Marley M. B. R. Vellasco et al., Using ensembles for adaptive learning: A comparative approach, 2013, The 2013 International Joint Conference on Neural Networks (IJCNN).

[5] Ludmila I. Kuncheva et al., Classifier Ensembles for Changing Environments, 2004, Multiple Classifier Systems.

[6] Geoff Hulten et al., Mining time-changing data streams, 2001, KDD '01.

[7] Xin Yao et al., The Impact of Diversity on Online Ensemble Learning in the Presence of Concept Drift, 2010, IEEE Transactions on Knowledge and Data Engineering.

[8] Jong-Hwan Kim et al., Quantum-Inspired Evolutionary Algorithms With a New Termination Criterion, Hε Gate, and Two-Phase Scheme, 2009.

[9] Jong-Hwan Kim et al., On Setting the Parameters of QEA for Practical Applications: Some Guidelines Based on Empirical Evidence, 2003, GECCO.

[10] Robi Polikar et al., Incremental learning in nonstationary environments with controlled forgetting, 2009, 2009 International Joint Conference on Neural Networks.

[11] Alexey Tsymbal et al., The problem of concept drift: definitions and related work, 2004.

[12] Mykola Pechenizkiy et al., Dynamic integration of classifiers for handling concept drift, 2008, Inf. Fusion.

[13] Haibo He et al., Towards incremental learning of nonstationary imbalanced data stream: a multiple selectively recursive approach, 2011, Evol. Syst.

[14] Jong-Hwan Kim et al., Quantum-inspired evolutionary algorithm for a class of combinatorial optimization, 2002, IEEE Trans. Evol. Comput.

[15] Xin Yao et al., DDD: A New Ensemble Approach for Dealing with Concept Drift, 2012, IEEE Transactions on Knowledge and Data Engineering.

[16] Marcus A. Maloof et al., Dynamic Weighted Majority: An Ensemble Method for Drifting Concepts, 2007, J. Mach. Learn. Res.

[17] Robi Polikar et al., Incremental Learning of Concept Drift in Nonstationary Environments, 2011, IEEE Transactions on Neural Networks.

[18] J. Royston, An Extension of Shapiro and Wilk's W Test for Normality to Large Samples, 1982.

[19] Jong-Hwan Kim et al., Quantum-inspired evolutionary algorithms with a new termination criterion, Hε gate, and two-phase scheme, 2004, IEEE Transactions on Evolutionary Computation.

[20] Konrad Jackowski et al., Fixed-size ensemble classifier system evolutionarily adapted to a recurring context with an unlimited pool of classifiers, 2013, Pattern Analysis and Applications.

[21] Marley M. B. R. Vellasco et al., NEVE: A Neuro-Evolutionary Ensemble for Adaptive Learning, 2013, AIAI.

[22] Francisco Herrera et al., A unifying view on dataset shift in classification, 2012, Pattern Recognit.

[23] Marcus A. Maloof et al., Dynamic weighted majority: a new ensemble method for tracking concept drift, 2003, Third IEEE International Conference on Data Mining.

[24] Nikunj C. Oza et al., Online Ensemble Learning, 2000, AAAI/IAAI.

[25] J. C. Schlimmer et al., Incremental learning from noisy data, 2004, Machine Learning.

[26] Robi Polikar et al., Learning concept drift in nonstationary environments using an ensemble of classifiers based approach, 2008, 2008 IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence).