Integrating global and local application of random subspace ensemble

Many data analysis problems involve investigating relationships between attributes in heterogeneous databases, where different prediction models can be more appropriate for different regions of the data. We propose a technique that integrates global and local applications of the random subspace ensemble method. In a comparison with other well-known combining methods on standard benchmark datasets, the proposed technique achieved better accuracy.
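One plausible reading of the approach, sketched below, is to pair a random subspace ensemble fitted on the entire training set (global) with one fitted on the nearest neighbours of each query instance (local), and to average their class-probability estimates. This is only an illustration of the idea, not the paper's implementation; the dataset, the neighbourhood size k, the base learner, and the equal-weight averaging rule are all assumptions.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors


def random_subspace_ensemble(n_estimators=25, random_state=0):
    # With bootstrap=False and max_features < 1.0, BaggingClassifier reduces to
    # the random subspace method: each tree sees all instances but only a
    # random subset of the features.
    return BaggingClassifier(
        n_estimators=n_estimators,
        max_features=0.5,
        bootstrap=False,
        random_state=random_state,
    )


def aligned_proba(model, x, classes):
    # Return class probabilities ordered according to `classes`, so global and
    # local estimates can be averaged even if a neighbourhood lacks some class.
    p = model.predict_proba(x.reshape(1, -1))[0]
    out = np.zeros(len(classes))
    for cls, prob in zip(model.classes_, p):
        out[classes == cls] = prob
    return out


X, y = load_breast_cancer(return_X_y=True)  # stand-in benchmark dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Global model: one random subspace ensemble fitted on all training data.
global_model = random_subspace_ensemble().fit(X_train, y_train)
classes = global_model.classes_

# Local models: for each test instance, fit a random subspace ensemble on its
# k nearest training neighbours, then average global and local probabilities.
k = 50  # assumed neighbourhood size
nn = NearestNeighbors(n_neighbors=k).fit(X_train)

correct = 0
for x, true_label in zip(X_test, y_test):
    idx = nn.kneighbors(x.reshape(1, -1), return_distance=False)[0]
    local_model = random_subspace_ensemble(n_estimators=10).fit(X_train[idx], y_train[idx])
    proba = 0.5 * (aligned_proba(global_model, x, classes) +
                   aligned_proba(local_model, x, classes))
    correct += int(classes[proba.argmax()] == true_label)

print(f"combined global+local accuracy: {correct / len(y_test):.3f}")
```

Equal-weight averaging is just one possible combination rule here; weighting the local and global estimates by neighbourhood density or validation accuracy would fit the same framework.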
