A class of smooth exact penalty function methods for optimization problems with orthogonality constraints

Cited by: 24
Authors
Xiao, Nachuan [1 ,2 ]
Liu, Xin [1 ,2 ]
Yuan, Ya-xiang [1 ]
Affiliations
[1] Chinese Acad Sci, Acad Math & Syst Sci, State Key Lab Sci & Engn Comp, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Sch Math Sci, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Orthogonality constraint; Stiefel manifold; augmented Lagrangian method; ALGORITHMS; MINIMIZATION; FRAMEWORK; ENERGY;
DOI
10.1080/10556788.2020.1852236
Chinese Library Classification
TP31 [Computer Software];
Subject Classification Codes
081202 ; 0835 ;
Abstract
Updating the augmented Lagrangian multiplier by a closed-form expression yields an efficient first-order infeasible approach for optimization problems with orthogonality constraints, which makes parallelization tractable for this type of problem. Inspired by this closed-form updating scheme, we propose a novel penalty function with compact convex constraints (PenC). We show that PenC can act as an exact penalty model that shares the same global minimizers as the original problem with orthogonality constraints. Based on PenC, we first propose a first-order algorithm called PenCF and establish its global convergence and local linear convergence rate under mild assumptions. For the case in which the Hessian can be computed and stored and a high-precision solution with a fast local convergence rate is sought, we propose a second-order approach called PenCS for solving PenC. To avoid expensive calculations or solving a hard subproblem when computing the Newton step, we propose a new strategy that computes it approximately while still retaining local quadratic convergence. Moreover, the main iterations of both PenCF and PenCS are orthonormalization-free and hence parallelizable. Numerical experiments illustrate that PenCF is comparable with existing first-order methods, and PenCS shows stability and high efficiency in obtaining high-precision solutions compared with existing second-order methods.
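To make the abstract's first-order scheme concrete, the following is a minimal Python sketch, not the authors' implementation: it freezes the Lagrangian multiplier at a closed-form symmetrized estimate and replaces the orthogonality constraint X^T X = I with a compact convex ball constraint, in the spirit of PenCF. All names and parameter values here (grad_f, beta, radius, step) are placeholders introduced for illustration, and the exact PenC model in the paper may differ in detail.

import numpy as np

def pencf_like_step(X, grad_f, beta=1.0, radius=2.0, step=1e-2):
    """One illustrative first-order step on a PenC-style penalty model
    for min f(X) s.t. X^T X = I_p, with X an n-by-p real matrix."""
    G = grad_f(X)
    Lam = 0.5 * (X.T @ G + G.T @ X)       # closed-form multiplier estimate
    C = X.T @ X - np.eye(X.shape[1])      # feasibility residual X^T X - I
    D = G - X @ Lam + beta * (X @ C)      # penalty-model search direction
    X_new = X - step * D                  # plain gradient step: no QR/SVD,
                                          # hence orthonormalization-free
    nrm = np.linalg.norm(X_new)           # project onto the compact convex
    if nrm > radius:                      # ball replacing X^T X = I
        X_new *= radius / nrm
    return X_new

Because each step uses only matrix multiplications and a norm, it parallelizes easily, which is the property the abstract emphasizes for both PenCF and PenCS.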
Pages: 1205-1241
Number of pages: 37