A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization

It is well known that a possibly nondifferentiable convex minimization problem can be transformed into an equivalent differentiable convex minimization problem via the Moreau--Yosida regularization. This paper presents a globally convergent algorithm for solving the latter problem. Under additional semismoothness and regularity assumptions, the proposed algorithm is shown to converge Q-superlinearly.
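For reference, the transformation mentioned above can be sketched as follows; the symbols $f$, $\lambda$, $F_\lambda$, and $p_\lambda$ are not fixed by the abstract and are chosen here for illustration.

```latex
% Moreau--Yosida regularization of a convex function f on R^n,
% with regularization parameter lambda > 0:
\[
  F_\lambda(x) \;=\; \min_{y \in \mathbb{R}^n}
    \Bigl\{\, f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \,\Bigr\}.
\]
% F_lambda is convex and continuously differentiable even when f is not,
% with gradient
\[
  \nabla F_\lambda(x) \;=\; \frac{x - p_\lambda(x)}{\lambda},
\]
% where p_lambda(x), the proximal point, is the unique minimizer of the
% inner problem. The minimizers of F_lambda coincide with those of f,
% so minimizing the smooth function F_lambda solves the original problem.
```

This is a standard statement of the regularization, not a summary of the paper's specific algorithm.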