Imputing missing values with unsupervised random trees

This work proposes a non-iterative strategy for missing value imputation that is guided by similarity between observations. Instead of explicitly computing distances or nearest neighbors, it assigns observations to overlapping buckets through recursive semi-random hyperplane cuts, and within each bucket computes weighted averages that serve as imputations for each variable. The quality of these imputations is often not as good as that of chained equations, but the proposed technique is much faster, non-iterative, can impute new data without recomputing anything, and scales easily to large and high-dimensional datasets, providing a significant improvement over simple mean/median imputation in regression and classification metrics computed on imputed values when other methods are not feasible.
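
As an illustration of the idea, the sketch below builds a small ensemble of unsupervised random trees and imputes each missing cell from the terminal buckets that the observation falls into. This is a simplified reading of the abstract, not the paper's reference implementation: it uses single-variable (axis-parallel) splits rather than general hyperplane cuts, sends rows whose split value is missing down both branches (which is what makes the buckets overlap), and weights each bucket's mean by its depth as a stand-in for the paper's weighting scheme. All names and parameters here (build_tree, collect_leaves, impute, max_depth, min_size, n_trees) are invented for the example.

    import numpy as np

    rng = np.random.default_rng(0)

    def build_tree(X, rows, depth=0, max_depth=8, min_size=5):
        # Each node keeps the row indices that reached it.  Rows whose split
        # variable is missing are sent down BOTH branches, so the resulting
        # buckets overlap.
        node = {"rows": rows, "depth": depth, "children": None}
        if depth >= max_depth or rows.size <= min_size:
            return node
        col = int(rng.integers(X.shape[1]))
        vals = X[rows, col]
        miss = np.isnan(vals)
        obs = vals[~miss]
        if obs.size < 2 or obs.min() == obs.max():
            return node
        thr = rng.uniform(obs.min(), obs.max())   # semi-random cut on one variable
        left = rows[miss | (vals <= thr)]
        right = rows[miss | (vals > thr)]
        node["split"] = (col, thr)
        node["children"] = (build_tree(X, left, depth + 1, max_depth, min_size),
                            build_tree(X, right, depth + 1, max_depth, min_size))
        return node

    def collect_leaves(node, x):
        # Route one (possibly incomplete) observation down a tree; when the
        # split variable is missing, follow both branches.
        if node["children"] is None:
            return [node]
        col, thr = node["split"]
        if np.isnan(x[col]):
            return (collect_leaves(node["children"][0], x)
                    + collect_leaves(node["children"][1], x))
        child = node["children"][0] if x[col] <= thr else node["children"][1]
        return collect_leaves(child, x)

    def impute(X, n_trees=50):
        # Fill each missing cell with a weighted average of the observed values
        # of that variable inside every bucket the observation lands in.
        X = np.asarray(X, dtype=float)
        trees = [build_tree(X, np.arange(X.shape[0])) for _ in range(n_trees)]
        out = X.copy()
        for i, j in zip(*np.where(np.isnan(X))):
            num = den = 0.0
            for tree in trees:
                for leaf in collect_leaves(tree, X[i]):
                    obs = X[leaf["rows"], j]
                    obs = obs[~np.isnan(obs)]
                    if obs.size:
                        w = 2.0 ** leaf["depth"]   # deeper bucket = fewer, more similar rows
                        num += w * obs.mean()
                        den += w
            out[i, j] = num / den if den > 0 else np.nanmean(X[:, j])
        return out

    # Toy usage: knock out 10% of the cells of a correlated dataset and fill them in.
    X_full = rng.normal(size=(300, 4))
    X_full[:, 3] = X_full[:, 0] + 0.1 * rng.normal(size=300)
    X_miss = X_full.copy()
    X_miss[rng.random(X_full.shape) < 0.1] = np.nan
    X_imp = impute(X_miss)

The depth-based weighting is only a stand-in for the paper's scheme: deeper nodes hold fewer, more similar observations, so their per-variable means are trusted more than those of shallow nodes, which matches the intuition behind the overlapping buckets described in the abstract. As the abstract notes, once the trees are built, new observations can be imputed by routing them through the existing ensemble without refitting anything.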
