Accelerated Difference of Convex functions Algorithm and its Application to Sparse Binary Logistic Regression
In this work, we present a variant of DCA (Difference of Convex functions Algorithm) with the aim of improving its performance. The proposed algorithm, named Accelerated DCA (ADCA), consists in incorporating Nesterov's acceleration technique into DCA. We first investigate ADCA for solving the standard DC program and rigorously study its convergence properties and convergence rate. Secondly, we develop ADCA for a special case of the standard DC program whose objective function is the sum of a differentiable function with L-Lipschitz continuous gradient (possibly nonconvex) and a DC function. We exploit the special structure of the problem to propose an efficient DC decomposition for which the corresponding ADCA scheme is inexpensive. As an application, we consider the sparse binary logistic regression problem. Numerical experiments on several benchmark datasets illustrate the efficiency of our algorithm and its superiority over well-known methods.
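To make the scheme concrete, below is a minimal sketch of an ADCA-style iteration for sparse binary logistic regression. It assumes a capped-ℓ1 DC regularizer, the DC decomposition G = (L/2)||·||² + λ||·||₁ and H = (L/2)||·||² − f + h built from the Lipschitz constant L of the logistic-loss gradient, and a Nesterov-style extrapolation accepted only when it does not increase the objective beyond the largest of the last q values; the function names, the specific regularizer, and the restart rule are illustrative choices and not necessarily the authors' exact scheme.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def logistic_loss_grad(w, X, y):
    # f(w) = (1/n) * sum_i log(1 + exp(-y_i * x_i^T w)), labels y_i in {-1, +1}
    n = X.shape[0]
    margins = y * (X @ w)
    loss = np.mean(np.logaddexp(0.0, -margins))
    grad = -(X.T @ (y * sigmoid(-margins))) / n
    return loss, grad

def capped_l1(w, lam, theta):
    # DC regularizer: lam * sum_j min(|w_j|, theta) = g(w) - h(w)
    # with g(w) = lam * ||w||_1 and h(w) = lam * sum_j max(|w_j| - theta, 0)
    return lam * np.sum(np.minimum(np.abs(w), theta))

def subgrad_h(w, lam, theta):
    # one subgradient of h(w) = lam * sum_j max(|w_j| - theta, 0)
    return lam * np.sign(w) * (np.abs(w) > theta)

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def adca_sparse_logreg(X, y, lam=0.1, theta=0.5, max_iter=500, q=5, tol=1e-8):
    n, d = X.shape
    # Lipschitz constant of the logistic-loss gradient: L <= ||X||_2^2 / (4n)
    L = np.linalg.norm(X, 2) ** 2 / (4.0 * n)

    def F(w):
        return logistic_loss_grad(w, X, y)[0] + capped_l1(w, lam, theta)

    w_prev = w = np.zeros(d)
    t_prev = t = 1.0
    F_hist = [F(w)]
    for _ in range(max_iter):
        # Nesterov-style extrapolation, kept only if the extrapolated point
        # does not exceed the largest of the last q objective values
        v = w + ((t_prev - 1.0) / t) * (w - w_prev)
        if F(v) > max(F_hist[-q:]):
            v = w
        _, grad_f = logistic_loss_grad(v, X, y)
        # DCA step for G = (L/2)||.||^2 + lam*||.||_1, H = (L/2)||.||^2 - f + h:
        # take y^k = L*v - grad f(v) + subgrad h(v), then minimize
        # G(x) - <y^k, x>, which reduces to a soft-thresholding step
        u = v - grad_f / L + subgrad_h(v, lam, theta) / L
        w_new = soft_threshold(u, lam / L)
        if np.linalg.norm(w_new - w) <= tol * max(1.0, np.linalg.norm(w)):
            w_prev, w = w, w_new
            break
        t_prev, t = t, (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        w_prev, w = w, w_new
        F_hist.append(F(w))
    return w
```

Because f has L-Lipschitz gradient, H is convex and each ADCA subproblem has the closed-form proximal solution above, so one iteration costs essentially one gradient evaluation plus a soft-thresholding, which is what makes the scheme inexpensive.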