The standard approach to learning decision trees from histogram data is to treat the bins as independent variables. However, since this approach may not fully exploit the underlying dependencies among the bins, an algorithm has been proposed that learns decision trees from histogram data by considering all bins simultaneously when partitioning examples at each node of the tree. Although this algorithm has been shown to improve predictive performance, its computational complexity is a major bottleneck, in particular for histograms with a large number of bins. In this paper, we instead propose a sliding window approach that selects subsets of bins to be considered simultaneously when partitioning examples. This significantly reduces the number of possible splits to consider, allowing substantially larger histograms to be handled. We also propose to evaluate the original bins independently, in addition to evaluating the subsets of bins, when performing splits. This ensures that the information obtained by treating bins simultaneously is a gain over what is already captured by the standard approach. Experiments on both synthetic and real-world datasets show that the new algorithm achieves good predictive performance without excessive computational cost.
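To make the reduction in candidate splits concrete, the following is a minimal sketch of how such split candidates could be enumerated. It is not the paper's implementation: the function name candidate_bin_subsets, the window_size parameter, and the restriction to contiguous fixed-size windows are assumptions for illustration, and the evaluation of multivariate splits over each subset is omitted.

```python
# Illustrative sketch only: enumerates the bin subsets a split search
# might consider under a sliding window scheme, plus the single bins
# evaluated independently as in the standard approach. Names and the
# contiguous-window assumption are hypothetical, not from the paper.

def candidate_bin_subsets(num_bins, window_size):
    """Enumerate candidate bin subsets for split evaluation.

    Instead of all 2**num_bins - 1 non-empty subsets, a sliding
    window of fixed size yields only (num_bins - window_size + 1)
    contiguous subsets. Each single bin is also kept, so the standard
    univariate splits remain available and any gain from treating
    bins simultaneously comes on top of them.
    """
    # Contiguous windows of bins, considered simultaneously.
    windows = [
        tuple(range(start, start + window_size))
        for start in range(num_bins - window_size + 1)
    ]
    # Original bins evaluated independently (standard approach).
    singles = [(b,) for b in range(num_bins)]
    return windows + singles


if __name__ == "__main__":
    # For a 6-bin histogram and window size 3: 4 windows + 6 single
    # bins = 10 candidates, versus 2**6 - 1 = 63 non-empty subsets.
    for subset in candidate_bin_subsets(num_bins=6, window_size=3):
        print(subset)
```

Because the number of candidates grows linearly rather than exponentially in the number of bins, this kind of enumeration is what allows substantially larger histograms to be handled.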