A composite splitting criterion using random sampling

The ever-growing presence of data has led to a large number of proposed classification algorithms, especially decision trees, over the last few years. However, learning decision trees from large datasets containing irrelevant attributes is quite different from learning from small and moderately sized datasets, and in practice small and moderately sized datasets are rare. Unfortunately, the most popular heuristic function, gain ratio, has a serious disadvantage when dealing with large datasets with irrelevant attributes. To tackle these issues, we design a new composite splitting criterion with a random sampling approach. Our random sampling method depends on a small random subset of attributes, and it is computationally cheap to act on such a set in reasonable time. The empirical and theoretical properties are validated on 40 UCI datasets. The experimental results support the efficacy of the proposed method in terms of tree size and accuracy.
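The paper's composite criterion itself is not reproduced here, but the random-sampling idea it builds on (evaluating a splitting heuristic such as gain ratio only on a small random subset of attributes, rather than all of them) can be illustrated with a minimal, hypothetical sketch; the function names and interfaces below are assumptions for illustration, not the authors' implementation:

```python
import math
import random

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def gain_ratio(rows, labels, attr):
    """Information gain of splitting on `attr`, normalized by split info."""
    n = len(rows)
    partitions = {}
    for row, y in zip(rows, labels):
        partitions.setdefault(row[attr], []).append(y)
    cond = sum(len(p) / n * entropy(p) for p in partitions.values())
    gain = entropy(labels) - cond
    split_info = -sum((len(p) / n) * math.log2(len(p) / n)
                      for p in partitions.values())
    return gain / split_info if split_info > 0 else 0.0

def best_split_sampled(rows, labels, k, rng=random):
    """Score only k randomly sampled attributes (cheap on large,
    high-dimensional data) and return the best-scoring one."""
    attrs = list(range(len(rows[0])))
    sample = rng.sample(attrs, min(k, len(attrs)))
    return max(sample, key=lambda a: gain_ratio(rows, labels, a))
```

For example, on a toy dataset where attribute 0 perfectly predicts the class and attribute 1 is noise, `best_split_sampled(rows, labels, k=2)` selects attribute 0; with `k` much smaller than the total number of attributes, each split decision touches only a fraction of the attribute set.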
