A Mathematical Assessment of the Isolation Tree Method for Outlier Detection in Big Data

In this paper, a mathematical analysis of the Isolation Random Forest method (IRF method) for anomaly detection is presented. We show that the IRF space can be endowed with a probability measure induced by the Isolation Tree algorithm (iTree). In this setting, the convergence of the IRF method is proved using the Law of Large Numbers. A couple of counterexamples are presented to show that the original method is inconclusive when used to detect anomalies, and that no quality certificate can be given for it. Hence, an alternative version of IRF is proposed, whose mathematical foundation, as well as its limitations, is fully justified. Finally, numerical experiments are presented to compare the performance of the classic IRF with the proposed one.
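The iTree mechanism underlying the IRF method can be sketched as follows. This is a minimal one-dimensional illustration, not the paper's implementation: the function names, the sampling choices, and the restriction to scalar data are assumptions made for clarity. The normalization constant `c(n)` and the score `s(x) = 2^(-E[h(x)]/c(n))` follow the standard Isolation Forest construction of Liu, Ting, and Zhou, where `E[h(x)]` is the path length of `x` averaged over the forest.

```python
import math
import random

def c(n):
    """Average path length of an unsuccessful BST search on n points,
    used to normalize path lengths (standard Isolation Forest constant)."""
    if n <= 1:
        return 0.0
    euler_gamma = 0.5772156649
    return 2.0 * (math.log(n - 1) + euler_gamma) - 2.0 * (n - 1) / n

def build_itree(points, depth, max_depth, rng):
    """Recursively isolate 1-D points with uniformly random split values."""
    if depth >= max_depth or len(points) <= 1 or min(points) == max(points):
        # Leaf: store the number of points it still holds.
        return ('leaf', len(points))
    split = rng.uniform(min(points), max(points))
    left = [p for p in points if p < split]
    right = [p for p in points if p >= split]
    return ('node', split,
            build_itree(left, depth + 1, max_depth, rng),
            build_itree(right, depth + 1, max_depth, rng))

def path_length(x, tree, depth=0):
    """Depth at which x lands, adjusted by c(k) for unsplit leaves of size k."""
    if tree[0] == 'leaf':
        return depth + c(tree[1])
    _, split, left, right = tree
    return path_length(x, left if x < split else right, depth + 1)

def anomaly_score(x, trees, n):
    """Score in (0, 1); values near 1 indicate anomalies (short paths)."""
    avg = sum(path_length(x, t) for t in trees) / len(trees)
    return 2.0 ** (-avg / c(n))

rng = random.Random(0)
# Cluster of typical points plus one far outlier.
data = [rng.gauss(0.0, 1.0) for _ in range(256)] + [10.0]
max_depth = math.ceil(math.log2(len(data)))
trees = [build_itree(data, 0, max_depth, rng) for _ in range(100)]
print(anomaly_score(10.0, trees, len(data)))  # outlier: score close to 1
print(anomaly_score(0.0, trees, len(data)))   # typical point: lower score
```

The outlier is isolated in very few splits, so its average path length is short and its score is high, while points inside the cluster require many splits and score lower; the counterexamples discussed in the paper concern configurations where this ordering fails.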