Recognizing Underlying Sparsity in Optimization

Exploiting sparsity is essential for improving the efficiency of solving large optimization problems. We present a method for recognizing the underlying sparsity structure of a nonlinear partially separable problem, and show how the sparsity of the Hessian matrices of the problem's functions can be improved by performing a nonsingular linear transformation in the space corresponding to the vector of variables. A combinatorial optimization problem is then formulated to increase the number of zeros of the Hessian matrices in the resulting transformed space, and a heuristic greedy algorithm is applied to this formulation. The resulting method can thus be viewed as a preprocessor for converting a problem with hidden sparsity into one in which sparsity is explicit. When it is combined with the sparse semidefinite programming (SDP) relaxation by Waki et al. for polynomial optimization problems (POPs), the proposed method is shown to extend the performance and applicability of this relaxation technique. Preliminary numerical results are presented to illustrate this claim.

This manuscript was also issued as Report 06/02, Department of Mathematics, University of Namur, 61, rue de Bruxelles, B-5000 Namur, Belgium, EU.

Department of Mathematics, Ewha Womans University, 11-1 Dahyun-dong, Sudaemoon-gu, Seoul 120-750, Korea. This research was supported by KOSEF R012005-000-10271-0. skim@ewha.ac.kr

Department of Mathematical and Computing Sciences, Tokyo Institute of Technology, 2-12-1 Oh-Okayama, Meguro-ku, Tokyo 152-8552, Japan. This research was supported by Grant-in-Aid for Scientific Research on Priority Areas 16016234. kojima@is.titech.ac.jp

Department of Mathematics, University of Namur, 61, rue de Bruxelles, B-5000 Namur, Belgium, EU. philippe.toint@fundp.ac.be
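The idea of exposing hidden sparsity through a linear change of variables can be illustrated on a toy example (this is only a sketch of the underlying principle, not the paper's algorithm; the function f and the transformation M below are invented for illustration). For f(x) = (x1 + x2)^2 + (x2 - x3)^2, the Hessian in the original variables has seven nonzero entries, but after the nonsingular substitution y1 = x1 + x2, y2 = x2 - x3, y3 = x3 the function becomes g(y) = y1^2 + y2^2, whose Hessian is diagonal. Since f(x) = g(Mx) implies ∇²f = Mᵀ(∇²g)M, the transformed Hessian is M⁻ᵀ(∇²f)M⁻¹:

```python
import numpy as np

# Hessian of f(x) = (x1 + x2)^2 + (x2 - x3)^2 in the original variables:
# seven nonzero entries, so the sparsity is "hidden".
H_x = 2 * np.array([[1, 1,  0],
                    [1, 2, -1],
                    [0, -1, 1]])

# Nonsingular linear transformation y = M x with
#   y1 = x1 + x2,  y2 = x2 - x3,  y3 = x3.
M = np.array([[1, 1,  0],
              [0, 1, -1],
              [0, 0,  1]])

# Hessian in the transformed space: H_y = M^{-T} H_x M^{-1}.
M_inv = np.linalg.inv(M)
H_y = M_inv.T @ H_x @ M_inv

print(np.round(H_y).astype(int))   # diagonal: diag(2, 2, 0)
```

In the transformed space the Hessian has only two nonzero entries, which is the kind of gain the combinatorial formulation in the paper seeks to maximize over admissible transformations.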

[1] Masakazu Muramatsu et al., SparsePOP: a Sparse Semidefinite Programming Relaxation of Polynomial Optimization Problems, 2005.

[2] Nobuo Yamashita et al., Analysis of Sparse Quasi-Newton Updates with Positive Definite Matrix Completion, 2014, Journal of the Operations Research Society of China.

[3] Masakazu Muramatsu et al., Sums of Squares and Semidefinite Programming Relaxations for Polynomial Optimization Problems with Structured Sparsity, 2004.

[4] Masakazu Kojima et al., Sparsity in sums of squares of polynomials, 2005, Math. Program.

[5] Masakazu Kojima et al., Generalized Lagrangian Duals and Sums of Squares Relaxations of Sparse Polynomial Optimization Problems, 2005, SIAM J. Optim.

[6] Nicholas I. M. Gould et al., GALAHAD, a library of thread-safe Fortran 90 packages for large-scale nonlinear optimization, 2003, TOMS.

[7] D. K. Smith et al., Numerical Optimization, 2001, J. Oper. Res. Soc.

[8] Jean B. Lasserre et al., Global Optimization with Polynomials and the Problem of Moments, 2000, SIAM J. Optim.

[9] Jos F. Sturm, A Matlab toolbox for optimization over symmetric cones, 1999.

[10] D. Gay, Automatically Finding and Exploiting Partially Separable Structure in Nonlinear Programming Problems, 1996.

[11] P. Toint et al., Improving the Decomposition of Partially Separable Functions in the Context of Large-Scale Optimization: a First Approach, 1993.

[12] B. Peyton et al., An Introduction to Chordal Graphs and Clique Trees, 1993.

[13] Andreas Griewank et al., On the existence of convex decompositions of partially separable functions, 1984, Math. Program.

[14] P. Toint et al., Partitioned variable metric updates for large structured optimization problems, 1982.

[15] Andreas Griewank et al., On the unconstrained optimization of partially separable functions, 1982.

[16] Jorge J. Moré et al., Testing Unconstrained Optimization Software, 1981, TOMS.