The Primal-Dual Model of an Objective Function
In the previous chapters, we proved that, in the Black-Box framework, non-smooth optimization problems are much more difficult than smooth ones. However, very often we know the explicit structure of the functional components. In this chapter, we show how this knowledge can be used to accelerate minimization methods and to extract useful information about the dual counterpart of the problem. The main acceleration idea is based on approximating a non-differentiable function by a differentiable one. We develop a technique for creating computable smoothed versions of non-differentiable functions and minimize them by Fast Gradient Methods. The number of iterations of the resulting methods is proportional to the square root of the number of iterations of the standard subgradient scheme, while the complexity of each iteration does not change. This technique can be used either in primal form or in a symmetric primal-dual form. We include an example of applying this approach to a problem of Semidefinite Optimization. The chapter concludes with an analysis of the Conditional Gradient method, which requires at each iteration only the solution of an auxiliary problem of minimizing a linear function. We show that this method can also reconstruct a primal-dual solution of the problem. A similar idea is used in the second-order Trust-Region Method with contraction, the first method of this type with provable global worst-case performance guarantees.
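To make the smoothing idea concrete, the following sketch works out one standard instance: entropy (log-sum-exp) smoothing of a finite maximum of linear functions, measured in the Euclidean norm. The symbols a_j, b_j, m, A and this particular choice of prox-term are illustrative; the chapter's construction is more general.

```latex
% Entropy smoothing of a piecewise-linear maximum (illustrative instance):
f(x) \;=\; \max_{1 \le j \le m} \bigl(\langle a_j, x \rangle + b_j\bigr),
\qquad
f_\mu(x) \;=\; \mu \ln \sum_{j=1}^{m}
      \exp\!\Bigl(\frac{\langle a_j, x \rangle + b_j}{\mu}\Bigr).
% Uniform approximation and smoothness (A is the matrix with rows a_j^T):
f(x) \;\le\; f_\mu(x) \;\le\; f(x) + \mu \ln m,
\qquad
\|\nabla f_\mu(x) - \nabla f_\mu(y)\|
      \;\le\; \frac{\|A\|^2}{\mu}\,\|x - y\|.
```

With the choice \(\mu = \varepsilon/(2\ln m)\), a Fast Gradient Method, which needs \(O(\sqrt{L/\varepsilon})\) iterations for an \(L\)-smooth objective, delivers an \(\varepsilon\)-solution of the original non-smooth problem in \(O(1/\varepsilon)\) iterations: the square root of the \(O(1/\varepsilon^2)\) bound of the subgradient scheme.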
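The Conditional Gradient method can likewise be sketched in a few lines. Below is a minimal Python illustration over the standard simplex, where the auxiliary linear problem is solved by picking a single vertex; this is a plain primal variant under assumed names (conditional_gradient, grad_f, B, c), not the chapter's primal-dual scheme.

```python
import numpy as np

def conditional_gradient(grad_f, x0, num_iters=200):
    """Conditional Gradient (Frank-Wolfe) over the standard simplex.

    The only subproblem per iteration is minimizing a linear function
    over the feasible set; on the simplex this reduces to choosing the
    coordinate with the smallest partial derivative.
    """
    x = x0.copy()
    for k in range(num_iters):
        g = grad_f(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0        # vertex solving min_{s in simplex} <g, s>
        gamma = 2.0 / (k + 2.0)      # classical open-loop step size
        x = (1.0 - gamma) * x + gamma * s
    return x

# Illustrative use: minimize ||Bx - c||^2 over the simplex.
rng = np.random.default_rng(0)
B, c = rng.standard_normal((20, 5)), rng.standard_normal(20)
x = conditional_gradient(lambda x: 2.0 * B.T @ (B @ x - c), np.full(5, 0.2))
```

Note that each iterate is a convex combination of at most k + 1 vertices, which is what makes the accumulated linear minimizers usable for reconstructing a dual model of the objective.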