An efficient uni-representation approach towards combining machine learners

In this paper, we present a novel approach to combining machine learners. The proposed approach improves accuracy on classification problems. We first describe the combination technique, present its implementation in Python, and compare it with individual learners. We then discuss feature space design and apply it to the proposed combination approach. Section I introduces the implementation language (Python) and the machine learning tool used to access the learning algorithms. Sections II and III cover the concept of combining learners and the various combination techniques. Section IV describes our technique, its procedure, the experiment, and the results. Section V presents feature space design, feature selection techniques, the steps of the feature selection method used, and the corresponding experiment and results.
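To make the roadmap above concrete, the following is a minimal sketch of combining several learners and prepending a feature-selection step, in the spirit of Sections IV and V. It is not the paper's actual implementation: scikit-learn, the Iris data, the three base learners (decision tree, logistic regression, k-nearest neighbours), majority voting, and the choice k=2 are all assumptions made here purely for illustration.

# Illustrative sketch only; the paper's own combination technique, toolkit,
# and data are described in Sections I-V. Everything below is an assumption.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import Pipeline
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import VotingClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Three heterogeneous base learners combined by majority (hard) voting.
combined = VotingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(random_state=0)),
        ("logreg", LogisticRegression(max_iter=1000)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
    ],
    voting="hard",
)

# A feature-selection step before the combined learner, mirroring the
# feature space design discussed in Section V (k=2 is arbitrary here).
model = Pipeline([
    ("select", SelectKBest(score_func=f_classif, k=2)),
    ("ensemble", combined),
])

model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# Compare the combined learner against each base learner via cross-validation.
for name, clf in combined.estimators:
    print(f"{name}: {cross_val_score(clf, X, y, cv=5).mean():.3f}")
print(f"combined: {cross_val_score(combined, X, y, cv=5).mean():.3f}")

A pipeline is used here so that feature selection is refit on each training fold during cross-validation, which keeps the comparison between the combined learner and the individual learners fair.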
