Optimal Multiple Intervals Discretization of Continuous Attributes for Supervised Learning

In this paper, we propose an extension of Fischer's algorithm to compute the optimal discretization of a continuous variable in the context of supervised learning. Our algorithm is highly efficient, since its running time depends only on the number of runs (maximal blocks of consecutive sorted values sharing the same class label) and not directly on the number of points in the sample data set. We also present an empirical comparison between the optimal algorithm and two hill-climbing heuristics.
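A dynamic program of the kind extended here can be sketched as follows. This is an illustrative sketch, not the paper's exact method: the run encoding (one class-count dictionary per run) and the entropy-based impurity criterion are assumptions made for the example. What it does show is the key complexity property from the abstract: the table is indexed by runs, so the cost depends on the number of runs, not on the number of data points.

```python
from math import log2

def optimal_discretization(runs, k):
    """Fisher-style DP: split a sequence of runs into k intervals
    minimizing total (size-weighted) entropy.

    runs: list of class-count dicts, one per run -- a maximal block of
          consecutive sorted values with the same class label.
          (Illustrative encoding; the paper's representation may differ.)
    k:    number of intervals.
    Returns (minimal total impurity, sorted cut positions between runs).
    Complexity is polynomial in the number of runs n, independent of
    the number of individual sample points.
    """
    n = len(runs)

    def entropy(counts):
        total = sum(counts.values())
        return -sum(c / total * log2(c / total)
                    for c in counts.values() if c) if total else 0.0

    def cost(i, j):
        # impurity of the interval covering runs[i:j]
        merged = {}
        for r in runs[i:j]:
            for c, v in r.items():
                merged[c] = merged.get(c, 0) + v
        return sum(merged.values()) * entropy(merged)

    INF = float("inf")
    # dp[m][j]: best cost of splitting the first j runs into m intervals
    dp = [[INF] * (n + 1) for _ in range(k + 1)]
    cut = [[-1] * (n + 1) for _ in range(k + 1)]
    dp[0][0] = 0.0
    for m in range(1, k + 1):
        for j in range(m, n + 1):
            for i in range(m - 1, j):
                c = dp[m - 1][i] + cost(i, j)
                if c < dp[m][j]:
                    dp[m][j], cut[m][j] = c, i

    # backtrack the optimal cut positions
    cuts, j = [], n
    for m in range(k, 1, -1):
        j = cut[m][j]
        cuts.append(j)
    return dp[k][n], sorted(cuts)
```

For example, with three runs `[{'a': 2}, {'b': 3}, {'a': 1}]` and `k = 2`, the optimal cut falls after the first run, since keeping the pure run of class `a` separate minimizes the weighted entropy of the remaining interval.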