Recognizing underlying sparsity in optimization

Exploiting sparsity is essential for solving large optimization problems efficiently. We present a method for recognizing the underlying sparsity structure of a nonlinear partially separable problem, and show how the sparsity of the Hessian matrices of the problem's functions can be improved by applying a nonsingular linear transformation to the space of variables. A combinatorial optimization problem is then formulated to increase the number of zeros of the Hessian matrices in the transformed space, and a greedy heuristic is applied to this formulation. The resulting method can thus be viewed as a preprocessor that converts a problem with hidden sparsity into one in which the sparsity is explicit. When combined with the sparse semidefinite programming relaxation of Waki et al. for polynomial optimization problems, the proposed method is shown to improve the performance and extend the applicability of that relaxation technique. Preliminary numerical results are presented to illustrate this claim.
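
To make the hidden-sparsity idea concrete, the sketch below (our own illustration, not the paper's algorithm or code; the objective and the transformation matrix T are invented for the example) shows a function whose Hessian is completely dense in the original variables x but becomes diagonal after the nonsingular linear change of variables y = Tx. The paper's greedy heuristic searches for such a transformation automatically; here T is chosen by hand.

```python
# Minimal sketch of hidden sparsity revealed by a linear change of variables.
# Not the paper's method: the objective f and the matrix T are hand-picked.
import numpy as np

n = 6
T = np.triu(np.ones((n, n)))  # y_i = x_i + x_{i+1} + ... + x_n; det(T) = 1

def f(x):
    # f(x) = sum_i (x_i + ... + x_n)^2 = ||T x||^2.  Every term couples a
    # suffix of the variables, so (with 1-based indices) the Hessian entry
    # d2f/dx_j dx_k = 2*min(j, k) is nonzero everywhere: fully dense.
    return float(np.sum((T @ x) ** 2))

def hessian(func, x, h=1e-3):
    # Central-difference Hessian; exact up to rounding here, since f is quadratic.
    m = len(x)
    H = np.zeros((m, m))
    I = np.eye(m)
    for i in range(m):
        for j in range(m):
            H[i, j] = (func(x + h*I[i] + h*I[j]) - func(x + h*I[i] - h*I[j])
                       - func(x - h*I[i] + h*I[j]) + func(x - h*I[i] - h*I[j])) / (4*h*h)
    return H

rng = np.random.default_rng(0)
x0 = rng.standard_normal(n)

H_x = hessian(f, x0)                       # Hessian in the original x-space
g = lambda y: f(np.linalg.solve(T, y))     # g(y) = f(T^{-1} y) = ||y||^2
H_y = hessian(g, T @ x0)                   # Hessian in the transformed y-space

nnz = lambda H, tol=1e-6: int(np.sum(np.abs(H) > tol))
print(f"nonzeros in x-space Hessian: {nnz(H_x)} / {n*n}")  # 36 / 36 (dense)
print(f"nonzeros in y-space Hessian: {nnz(H_y)} / {n*n}")  #  6 / 36 (diagonal)
```

Running the sketch reports a fully dense Hessian in the original space and a diagonal one in the transformed space; a sparsity-exploiting method, such as the relaxation of Waki et al., can then be applied to the transformed problem.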

[1] P. Toint et al., Local convergence analysis for partitioned quasi-Newton updates, 1982.

[2] Andreas Griewank et al., On the unconstrained optimization of partially separable functions, 1982.

[3] Masakazu Muramatsu et al., A note on sparse SOS and SDP relaxations for polynomial optimization problems over symmetric cones, Comput. Optim. Appl., 2009.

[4] Jorge J. Moré et al., Testing unconstrained optimization software, TOMS, 1981.

[5] Nicholas I. M. Gould et al., GALAHAD, a library of thread-safe Fortran 90 packages for large-scale nonlinear optimization, TOMS, 2003.

[6] B. Peyton et al., An introduction to chordal graphs and clique trees, 1993.

[7] Masakazu Kojima et al., Generalized Lagrangian duals and sums of squares relaxations of sparse polynomial optimization problems, SIAM J. Optim., 2005.

[8] D. Gay, Automatically finding and exploiting partially separable structure in nonlinear programming problems, 1996.

[9] P. Toint et al., Improving the decomposition of partially separable functions in the context of large-scale optimization: a first approach, 1993.

[10] M. J. D. Powell, Nonlinear Optimization 1981, 1982.

[11] A. George et al., Graph Theory and Sparse Matrix Computation, 1993.

[12] Masakazu Muramatsu et al., SparsePOP: a sparse semidefinite programming relaxation of polynomial optimization problems, 2005.

[13] Nobuo Yamashita et al., Analysis of sparse quasi-Newton updates with positive definite matrix completion, Journal of the Operations Research Society of China, 2014.

[14] Jos F. Sturm, Using SeDuMi 1.02, a Matlab toolbox for optimization over symmetric cones, 1999.

[15] Masakazu Kojima et al., Sparsity in sums of squares of polynomials, Math. Program., 2005.

[16] Nicholas I. M. Gould et al., LANCELOT: A Fortran Package for Large-Scale Nonlinear Optimization (Release A), 1992.

[17] Masakazu Muramatsu et al., Sums of squares and semidefinite programming relaxations for polynomial optimization problems with structured sparsity, 2004.

[18] Jean B. Lasserre, Global optimization with polynomials and the problem of moments, SIAM J. Optim., 2000.

[19] Nicholas I. M. Gould et al., CUTEr and SifDec: a constrained and unconstrained testing environment, revisited, TOMS, 2003.

[20] P. Toint et al., Partitioned variable metric updates for large structured optimization problems, 1982.

[21] Jorge Nocedal and Stephen J. Wright, Numerical Optimization, Springer, 1999.

[22] Andreas Griewank et al., On the existence of convex decompositions of partially separable functions, Math. Program., 1984.

[23] W. Hager et al., Large Scale Optimization: State of the Art, 1993.