Accelerating difficulty estimation for conformal regression forests

The conformal prediction framework allows the probability of making incorrect predictions to be bounded by a user-provided confidence level. In addition to a learning algorithm, the framework requires a real-valued function, called a nonconformity measure, to be specified. The nonconformity measure does not affect the error rate, but the resulting efficiency, i.e., the size of the output prediction regions, may vary substantially. A recent large-scale empirical evaluation of conformal regression approaches showed that using random forests as the learning algorithm, together with a nonconformity measure based on out-of-bag errors normalized by a nearest-neighbor-based difficulty estimate, results in state-of-the-art efficiency. However, the nearest-neighbor procedure incurs a significant computational cost. In this study, a more straightforward nonconformity measure is investigated, in which the difficulty estimate used for normalization is the variance of the predictions made by the individual trees in the forest. A large-scale empirical evaluation is presented, showing that both the nearest-neighbor-based and the variance-based measures significantly outperform a standard (non-normalized) nonconformity measure, while no significant difference in efficiency between the two normalized approaches is observed. The evaluation moreover shows that the computational cost of the variance-based measure is several orders of magnitude lower than that of the nearest-neighbor-based measure. The use of out-of-bag instances for calibration does, however, result in nonconformity scores that are distributed differently from those obtained for test instances, calling the validity of the approach into question. An adjustment of the variance-based measure is presented and shown to be valid, while also having a significant positive effect on efficiency. For conformal regression forests, the variance-based nonconformity measure is hence a computationally efficient and theoretically well-founded alternative to the nearest-neighbor procedure.
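
To make the measures discussed above concrete, the following is a sketch in standard inductive conformal prediction notation; the symbols mu_i (difficulty estimate), beta (sensitivity parameter), and alpha_{s(delta)} (the calibration-score percentile matching confidence level 1 - delta) follow common usage in the conformal regression literature and are not necessarily the paper's exact notation:

```latex
% Standard (non-normalized) nonconformity score for calibration instance i:
\alpha_i = \lvert y_i - \hat{y}_i \rvert

% Normalized score, with difficulty estimate \mu_i (nearest-neighbor-based
% or variance-based) and sensitivity parameter \beta \ge 0:
\alpha_i = \frac{\lvert y_i - \hat{y}_i \rvert}{\mu_i + \beta}

% Resulting prediction interval for a test instance j, where
% \alpha_{s(\delta)} is the appropriate percentile of the calibration
% scores: hard instances (large \mu_j) get wider intervals, easy ones
% narrower, which is what improves efficiency at a fixed error rate.
\hat{y}_j \pm \alpha_{s(\delta)} \, (\mu_j + \beta)
```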

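A minimal runnable sketch of the variance-based approach follows, assuming scikit-learn. It is illustrative rather than the authors' exact procedure: the paper calibrates on out-of-bag errors (with the distributional adjustment discussed above), whereas this sketch uses a separate calibration set, i.e., the plain split-conformal setup; the dataset, BETA, and CONFIDENCE values are arbitrary choices.

```python
# Split (inductive) conformal regression with a variance-based difficulty
# estimate. Illustrative sketch only: the paper calibrates on out-of-bag
# predictions, while this uses a held-out calibration set.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

CONFIDENCE = 0.95   # user-provided confidence level (arbitrary choice here)
BETA = 0.01         # sensitivity parameter added to the difficulty estimate

X, y = make_regression(n_samples=2000, n_features=10, noise=10.0, random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5, random_state=0)
X_cal, X_test, y_cal, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)

def predict_with_difficulty(model, X):
    """Point prediction plus the variance of the individual trees' predictions."""
    per_tree = np.stack([tree.predict(X) for tree in model.estimators_])  # (n_trees, n)
    return per_tree.mean(axis=0), per_tree.var(axis=0)

# Normalized nonconformity scores on the calibration set.
y_hat_cal, var_cal = predict_with_difficulty(forest, X_cal)
scores = np.abs(y_cal - y_hat_cal) / (var_cal + BETA)

# Percentile of the calibration scores matching the confidence level.
k = int(np.ceil(CONFIDENCE * (len(scores) + 1))) - 1
alpha = np.sort(scores)[min(k, len(scores) - 1)]

# Difficulty-scaled prediction intervals for the test set.
y_hat, var = predict_with_difficulty(forest, X_test)
half_width = alpha * (var + BETA)
lower, upper = y_hat - half_width, y_hat + half_width

coverage = np.mean((y_test >= lower) & (y_test <= upper))
print(f"empirical coverage: {coverage:.3f}, mean width: {np.mean(2 * half_width):.2f}")
```

Note that the difficulty estimate is a by-product of computing the forest's prediction, which is why this measure avoids the nearest-neighbor search and its associated computational cost entirely.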