An algorithm for computing estimators that optimize step functions

The average of many random step functions is a discontinuous surface with numerous local optima, even though it may converge to a smooth surface with a unique optimum as the number of step functions tends to infinity. Such a surface arises when certain types of econometric estimators are used, including variants of the maximum score estimator. I propose an algorithm for computing the optimum of such a surface, to which standard gradient-based optimization methods are inapplicable. The algorithm replaces the discontinuous surface with a sequence of easily optimized continuous surfaces that converge to it. Sufficient conditions under which the algorithm converges to a global optimum are given, and the algorithm's performance is evaluated in a simple but relevant application.
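To make the idea concrete, the following sketch illustrates one way such a continuation scheme might look for a maximum-score-type objective. The data-generating process, the logistic smoothing kernel, the bandwidth schedule, and the scale normalization are all illustrative assumptions, not the paper's actual specification:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(0)

# Hypothetical binary-choice data: y = 1{x'beta + eps >= 0}.
n = 2000
x = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([0.5, 1.0])  # identified only up to scale; normalize b[1] = 1
eps = rng.logistic(size=n)
y = (x @ beta_true + eps >= 0).astype(float)

def score(b):
    """Maximum-score objective: an average of step functions, discontinuous in b."""
    return np.mean((2 * y - 1) * (x @ b >= 0))

def smoothed_score(b, h):
    """Continuous surrogate: the indicator is replaced by a logistic CDF with
    bandwidth h; as h -> 0 this surface converges to the step-function average."""
    return np.mean((2 * y - 1) * expit((x @ b) / h))

# Continuation: optimize a sequence of smoothed surfaces with shrinking bandwidth,
# warm-starting each problem at the previous optimum.
b = np.array([0.0, 1.0])  # scale normalization: fix b[1] = 1, optimize b[0]
for h in [1.0, 0.3, 0.1, 0.03, 0.01]:
    res = minimize(lambda b0: -smoothed_score(np.array([b0[0], 1.0]), h),
                   x0=[b[0]], method="Nelder-Mead")
    b = np.array([res.x[0], 1.0])

print("estimated coefficient:", b[0], "  score at estimate:", score(b))
```

The warm starts are what make the scheme work: each heavily smoothed surface is easy to optimize, and its optimum seeds the next, less smoothed problem, so the iterates can track the optimum as the surrogate approaches the discontinuous surface.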