An inexact projected gradient method with rounding and lifting by nonlinear programming for solving rank-one semidefinite relaxation of polynomial optimization

Cited by: 14
Authors
Yang, Heng [1 ]
Liang, Ling [2 ]
Carlone, Luca [3 ]
Toh, Kim-Chuan [4 ,5 ]
Affiliations
[1] NVIDIA Res, Santa Clara, CA USA
[2] Natl Univ Singapore, Dept Math, Singapore, Singapore
[3] MIT, Lab Informat & Decis Syst, Cambridge, MA USA
[4] Natl Univ Singapore, Dept Math, Singapore, Singapore
[5] Natl Univ Singapore, Inst Operat Res & Analyt, Singapore, Singapore
Keywords
Semidefinite programming; Polynomial optimization; Inexact projected gradient method; Rank-one solutions; Nonlinear programming; Degeneracy; AUGMENTED LAGRANGIAN METHOD; MOMENT-SOS HIERARCHY; GLOBAL OPTIMIZATION; SQUARES; ALGORITHM; MATRIX; APPROXIMATION; SOFTWARE; SPARSITY; MATLAB;
DOI
10.1007/s10107-022-01912-6
Chinese Library Classification
TP31 [Computer Software];
Discipline Codes
081202 ; 0835 ;
Abstract
We consider solving high-order and tight semidefinite programming (SDP) relaxations of nonconvex polynomial optimization problems (POPs) that often admit degenerate rank-one optimal solutions. Instead of solving the SDP alone, we propose a new algorithmic framework that blends local search using the nonconvex POP into global descent using the convex SDP. In particular, we first design a globally convergent inexact projected gradient method (iPGM) for solving the SDP that serves as the backbone of our framework. We then accelerate iPGM by taking long, but safeguarded, rank-one steps generated by fast nonlinear programming algorithms. We prove that the new framework is still globally convergent for solving the SDP. To solve the iPGM subproblem of projecting a given point onto the feasible set of the SDP, we design a two-phase algorithm with phase one using a symmetric Gauss-Seidel based accelerated proximal gradient method (sGS-APG) to generate a good initial point, and phase two using a modified limited-memory BFGS (L-BFGS) method to obtain an accurate solution. We analyze the convergence for both phases and establish a novel global convergence result for the modified L-BFGS that does not require the objective function to be twice continuously differentiable. We conduct numerical experiments for solving second-order SDP relaxations arising from a diverse set of POPs. Our framework demonstrates state-of-the-art efficiency, scalability, and robustness in solving degenerate SDPs to high accuracy, even in the presence of millions of equality constraints.
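The abstract describes a projected gradient method on an SDP feasible set that is accelerated by safeguarded rank-one steps: a nonlinear-programming (or rounding) step is accepted only if it does not undo the global descent guaranteed by the convex iteration. The sketch below is an illustrative toy, not the paper's algorithm: it minimizes a linear objective over the trace-one PSD spectrahedron, uses an inexact projection (PSD projection followed by trace normalization, a crude surrogate for the exact projection the paper computes with sGS-APG and L-BFGS), and safeguards a rank-one rounding of each iterate by accepting it only when it descends. All function names and parameters here are hypothetical.

```python
import numpy as np

def project_psd(X):
    """Euclidean projection onto the PSD cone via eigendecomposition."""
    w, V = np.linalg.eigh((X + X.T) / 2)
    w = np.clip(w, 0.0, None)
    return (V * w) @ V.T  # V diag(w) V^T

def safeguarded_pgm(C, X0, step=0.1, iters=200):
    """Toy projected gradient descent for min <C, X> over {X PSD, tr(X) = 1},
    with a safeguarded rank-one 'long step': the rank-one rounding of the
    iterate is accepted only if it does not increase the objective."""
    X = X0.copy()
    for _ in range(iters):
        # gradient step on the linear objective <C, X> (gradient is C)
        Y = X - step * C
        # inexact projection: PSD projection, then trace normalization
        Y = project_psd(Y)
        tr = np.trace(Y)
        if tr > 1e-12:
            Y = Y / tr
        # rank-one candidate from the leading eigenvector (rounding)
        w, V = np.linalg.eigh(Y)
        v = V[:, -1]
        R = np.outer(v, v)
        # safeguard: take the rank-one step only if it descends
        X = R if np.trace(C @ R) <= np.trace(C @ Y) else Y
    return X
```

In the paper's framework the safeguard plays the same role at a much larger scale: the fast nonlinear-programming iterate is a candidate long step, and global convergence of the underlying iPGM is preserved because unsafe candidates are rejected.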
Pages: 409-472
Page count: 64