Bagging is one of the most effective computationally intensive procedures for improving unstable estimators or classifiers, and it is especially useful for high-dimensional data problems. Here we formalize the notion of instability and derive theoretical results analyzing the variance-reduction effect of bagging (and variants thereof), mainly in hard decision problems, which include estimation after testing in regression and decision trees for regression functions and classifiers. Hard decisions create instability, and bagging is shown to smooth such hard decisions, yielding smaller variance and mean squared error. Motivated by these theoretical explanations, we propose subagging, an alternative aggregation scheme based on subsampling: it is computationally cheaper but still shows approximately the same accuracy as bagging. Moreover, our theory reveals first-order improvements, in line with simulation studies. In particular, we obtain an asymptotic limiting distribution at the cube-root rate for the split point when fitting piecewise constant functions. Denoting the sample size by n, it follows that in a cylindrical neighborhood of diameter n^(-1/3) around the theoretically optimal split point, the variance and mean squared error reduction of subagging can be characterized analytically. Because of this slow rate, our reasoning also provides an explanation on the global scale, for the whole covariate space, in a decision tree with finitely many splits.
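To make the setting concrete, the following is a minimal sketch of the piecewise-constant (decision-stump) fit and of the two aggregation schemes the abstract contrasts: bagging (resampling with replacement) and subagging (subsampling without replacement). The function names, the subsample fraction, and the toy step-function data are illustrative choices, not the paper's notation; the point is only that averaging stumps over resamples smooths the hard split decision.

```python
import random
from statistics import mean

def fit_stump(x, y):
    """Fit a one-split piecewise-constant function (a decision stump)
    by exhaustive search over candidate split points, minimizing the
    residual sum of squares -- the 'hard decision' discussed above."""
    best = None
    for s in sorted(set(x))[:-1]:
        left = [yi for xi, yi in zip(x, y) if xi <= s]
        right = [yi for xi, yi in zip(x, y) if xi > s]
        sse = (sum((yi - mean(left)) ** 2 for yi in left)
               + sum((yi - mean(right)) ** 2 for yi in right))
        if best is None or sse < best[0]:
            best = (sse, s, mean(left), mean(right))
    _, s, left_mean, right_mean = best
    return lambda t: left_mean if t <= s else right_mean

def aggregated_stump(x, y, n_estimators=50, frac=0.5,
                     replace=False, seed=0):
    """Average stumps over resamples: replace=True gives bagging
    (bootstrap samples of size n), replace=False gives subagging
    (subsamples of size frac * n, drawn without replacement)."""
    rng = random.Random(seed)
    n = len(x)
    stumps = []
    for _ in range(n_estimators):
        if replace:
            idx = [rng.randrange(n) for _ in range(n)]
        else:
            idx = rng.sample(range(n), int(frac * n))
        stumps.append(fit_stump([x[i] for i in idx],
                                [y[i] for i in idx]))
    # The average over resampled split points is a smooth(er) function.
    return lambda t: mean(f(t) for f in stumps)
```

On step-function data, a single stump predicts a hard 0/1 jump, while the subagged predictor interpolates near the split point, which is exactly the smoothing effect the variance analysis quantifies.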