An inexact projected gradient method with rounding and lifting by nonlinear programming for solving rank-one semidefinite relaxation of polynomial optimization

Cited by: 14
Authors
Yang, Heng [1]
Liang, Ling [2]
Carlone, Luca [3]
Toh, Kim-Chuan [4,5]
Affiliations
[1] NVIDIA Research, Santa Clara, CA, USA
[2] National University of Singapore, Department of Mathematics, Singapore
[3] MIT, Laboratory for Information and Decision Systems, Cambridge, MA, USA
[4] National University of Singapore, Department of Mathematics, Singapore
[5] National University of Singapore, Institute of Operations Research and Analytics, Singapore
Keywords
Semidefinite programming; Polynomial optimization; Inexact projected gradient method; Rank-one solutions; Nonlinear programming; Degeneracy; Augmented Lagrangian method; Moment-SOS hierarchy; Global optimization; Squares; Algorithm; Matrix; Approximation; Software; Sparsity; MATLAB
DOI
10.1007/s10107-022-01912-6
CLC classification
TP31 [Computer software]
Discipline codes
081202; 0835
Abstract
We consider solving high-order and tight semidefinite programming (SDP) relaxations of nonconvex polynomial optimization problems (POPs) that often admit degenerate rank-one optimal solutions. Instead of solving the SDP alone, we propose a new algorithmic framework that blends local search using the nonconvex POP into global descent using the convex SDP. In particular, we first design a globally convergent inexact projected gradient method (iPGM) for solving the SDP that serves as the backbone of our framework. We then accelerate iPGM by taking long, but safeguarded, rank-one steps generated by fast nonlinear programming algorithms. We prove that the new framework is still globally convergent for solving the SDP. To solve the iPGM subproblem of projecting a given point onto the feasible set of the SDP, we design a two-phase algorithm with phase one using a symmetric Gauss-Seidel based accelerated proximal gradient method (sGS-APG) to generate a good initial point, and phase two using a modified limited-memory BFGS (L-BFGS) method to obtain an accurate solution. We analyze the convergence for both phases and establish a novel global convergence result for the modified L-BFGS that does not require the objective function to be twice continuously differentiable. We conduct numerical experiments for solving second-order SDP relaxations arising from a diverse set of POPs. Our framework demonstrates state-of-the-art efficiency, scalability, and robustness in solving degenerate SDPs to high accuracy, even in the presence of millions of equality constraints.
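The abstract describes an outer loop that alternates global descent (a projected gradient step on the SDP variable) with local acceleration (a safeguarded rank-one rounding step). The minimal Python sketch below illustrates only that outer structure under strong simplifications: the SDP feasible set is replaced by the PSD cone alone (the paper projects onto the full affine-PSD intersection via its two-phase sGS-APG / L-BFGS solver), the objective is a toy quadratic f(X) = 0.5*||X - M||_F^2 rather than an SDP objective, and names such as toy_outer_loop are hypothetical. It is a schematic sketch, not the authors' implementation.

    import numpy as np

    def proj_psd(X):
        """Project a symmetric matrix onto the positive semidefinite cone."""
        w, V = np.linalg.eigh((X + X.T) / 2)
        return (V * np.maximum(w, 0.0)) @ V.T

    def rank_one_round(X):
        """Round X to the rank-one matrix built from its leading eigenpair."""
        w, V = np.linalg.eigh((X + X.T) / 2)
        v = V[:, -1] * np.sqrt(max(w[-1], 0.0))
        return np.outer(v, v)

    def objective(X, M):
        """Toy objective f(X) = 0.5 * ||X - M||_F^2 standing in for the SDP objective."""
        return 0.5 * np.linalg.norm(X - M) ** 2

    def toy_outer_loop(M, X0, step=0.5, iters=100):
        """Schematic projected gradient loop with safeguarded rank-one rounding."""
        X = X0
        for _ in range(iters):
            # Global descent: projected gradient step (projection simplified
            # to the PSD cone; the paper projects onto the full feasible set).
            X = proj_psd(X - step * (X - M))
            # Local acceleration: rank-one rounding, accepted only when it
            # does not increase the objective (the safeguard).
            Xr = rank_one_round(X)
            if objective(Xr, M) <= objective(X, M):
                X = Xr
        return X

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A = rng.standard_normal((6, 6))
        M = (A + A.T) / 2 + np.eye(6)        # toy data matrix
        X = toy_outer_loop(M, np.zeros((6, 6)))
        print("objective value:", objective(X, M))
        print("rank of X:", np.linalg.matrix_rank(X))

In the paper the safeguard plays the same role as in this sketch: the rank-one candidate produced by fast nonlinear programming is accepted only if it does not break the global convergence guarantees of the inexact projected gradient method.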
Pages: 409-472
Page count: 64
References (92 in total)
[11] BenRosen J. SIAM Journal on Matrix Analysis and Applications, 1996, 17: 110.
[12] Bertsekas D P. Athena Scientific Optimization and Computation Series. Athena Scientific, 1999.
[13] Blekherman G. Semidefinite Optimiz., 2020.
[14] Boumal N. Advances in Neural Information Processing Systems, 2016: 2757.
[15] Boumal N. Journal of Machine Learning Research, 2014, 15: 1455.
[16] Briales J, Gonzalez-Jimenez J. Convex global 3D registration with Lagrangian duality. In: 30th IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2017), 2017: 5612-5621.
[17] Burer S, Monteiro R D C. A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization. Mathematical Programming, 2003, 95(2): 329-357.
[18] Burer S, Ye Y. Exact semidefinite formulations for a class of (random and non-random) nonconvex quadratic programs. Mathematical Programming, 2020, 181(1): 1-17.
[19] Candes E J, Eldar Y C, Strohmer T, Voroninski V. Phase retrieval via matrix completion. SIAM Review, 2015, 57(2): 225-251.
[20] Chen L, Sun D, Toh K C. An efficient inexact symmetric Gauss-Seidel based majorized ADMM for high-dimensional convex composite conic programming. Mathematical Programming, 2017, 161(1-2): 237-270.