Integrating global and local boosting

Several data analysis problems involve investigating relationships between attributes in related, heterogeneous databases, where different prediction models may be more appropriate for different regions of the instance space. A new technique that integrates global and local boosting is proposed. A comparison with other well-known and widely used combining methods on standard benchmark datasets shows that the proposed technique produces more accurate results.
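
The abstract does not spell out the combination scheme, but a common reading of "global plus local boosting" is: fit one boosted ensemble on the whole training set, fit another on the k nearest neighbours of each query point, and merge their outputs. The sketch below is a minimal illustration under those assumptions; the neighbourhood size, the base learner, the 50/50 probability average, and the function name global_local_boost_predict are illustrative choices, not details taken from the paper.

```python
# Minimal sketch (not the paper's algorithm): combine a globally trained
# AdaBoost model with an AdaBoost model trained on each test instance's
# k nearest neighbours. All parameter choices below are assumptions.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.neighbors import NearestNeighbors


def global_local_boost_predict(X_train, y_train, X_test, k=50):
    """Average class probabilities of a global and a local boosted model."""
    X_train, y_train = np.asarray(X_train), np.asarray(y_train)
    X_test = np.asarray(X_test)
    k = min(k, len(X_train))

    # Global model: boosted decision stumps fit on the full training set.
    global_model = AdaBoostClassifier(n_estimators=50).fit(X_train, y_train)
    classes = global_model.classes_
    class_index = {c: i for i, c in enumerate(classes)}

    # Neighbourhood index over the training data.
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)

    predictions = []
    for x in X_test:
        x = x.reshape(1, -1)
        _, idx = nn.kneighbors(x)
        X_loc, y_loc = X_train[idx[0]], y_train[idx[0]]

        global_proba = global_model.predict_proba(x)[0]

        if np.unique(y_loc).size < 2:
            # Single-class neighbourhood: fall back to the global model alone.
            proba = global_proba
        else:
            # Local model: boosting restricted to the neighbourhood.
            local_model = AdaBoostClassifier(n_estimators=50).fit(X_loc, y_loc)
            local_proba = np.zeros_like(global_proba)
            for c, p in zip(local_model.classes_, local_model.predict_proba(x)[0]):
                local_proba[class_index[c]] = p
            # Assumed combination: simple average of the two probability vectors.
            proba = 0.5 * (global_proba + local_proba)

        predictions.append(classes[int(np.argmax(proba))])
    return np.array(predictions)
```

One design note on this sketch: when a neighbourhood contains a single class, the local booster would be degenerate, so the code falls back to the global model for that query point.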
