A 2-Stage Algorithm for Minimax Optimization

The problem of minimizing the maximum of a finite set of smooth functions can be solved by a method that uses only first-order derivative information, and normally this method has a quadratic final rate of convergence. However, if a certain regularity condition is not fulfilled at the solution, then second-order information is required to obtain fast final convergence. We present a method that combines the two types of algorithm. If an irregularity is detected, a switch is made from the first-order method to a method based on approximations of the second-order information that use only first derivatives. We prove that the combined method has guaranteed convergence properties and illustrate its behaviour with numerical examples.
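To make the setting concrete, the problem is to minimize F(x) = max_{1 <= i <= m} f_i(x), where each f_i is smooth. The sketch below is a minimal illustration of such a two-stage loop, not the method of this paper: a subgradient-style first stage hands over to an off-the-shelf quasi-Newton routine (SciPy's BFGS applied to a log-sum-exp smoothing of the max) once progress stalls. The stall test, step sizes, smoothing temperature, and all function names are illustrative assumptions.

```python
# Schematic two-stage minimax loop (illustrative only, not the paper's algorithm).
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def F(x, fs):
    """Minimax objective: the pointwise maximum of the smooth functions fs."""
    return max(f(x) for f in fs)

def subgrad(x, fs, grads):
    """A subgradient of F at x: the gradient of one maximizing function."""
    i = int(np.argmax([f(x) for f in fs]))
    return grads[i](x)

def two_stage_minimax(x0, fs, grads, steps=200, lr=0.05, stall_tol=1e-6, t=1e-2):
    x = np.asarray(x0, dtype=float)
    prev = F(x, fs)
    # Stage 1: first-order (subgradient) iterations with a shrinking step size.
    for k in range(steps):
        x = x - lr / (1 + k) * subgrad(x, fs, grads)
        cur = F(x, fs)
        if abs(prev - cur) < stall_tol:   # crude stand-in for an irregularity test
            break
        prev = cur
    # Stage 2: quasi-Newton (BFGS) on a smooth log-sum-exp approximation of the max,
    # so only first derivatives (here finite-differenced) are needed.
    smooth = lambda z: t * logsumexp([f(z) / t for f in fs])
    return minimize(smooth, x, method="BFGS").x

if __name__ == "__main__":
    fs = [lambda x: (x[0] - 1) ** 2 + x[1] ** 2,
          lambda x: (x[0] + 1) ** 2 + x[1] ** 2]
    grads = [lambda x: np.array([2 * (x[0] - 1), 2 * x[1]]),
             lambda x: np.array([2 * (x[0] + 1), 2 * x[1]])]
    print(two_stage_minimax([2.0, 2.0], fs, grads))   # minimax point near [0, 0]
```

The example deliberately uses two functions whose maximum has a kink at the solution; this is the kind of point where a purely first-order stage slows down and second-order (or approximated second-order) information becomes valuable.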