On the softplus penalty for large-scale convex optimization

Cited: 0
Authors
Li, Meng [1]
Grigas, Paul [1]
Atamturk, Alper [1]
Affiliations
[1] Univ Calif Berkeley, Dept Ind Engn & Operat Res, Berkeley, CA 94720 USA
Funding
U.S. National Science Foundation
Keywords
Convex optimization; Penalty method; Gradient method
DOI
10.1016/j.orl.2023.10.015
Chinese Library Classification
C93 [Management Science]; O22 [Operations Research]
Discipline Codes
070105; 12; 1201; 1202; 120202
Abstract
We study a penalty reformulation of constrained convex optimization based on the softplus penalty function. For strongly convex objectives, we develop upper bounds on the objective value gap and the violation of constraints for the solutions to the penalty reformulations by analyzing the solution path of the reformulation with respect to the smoothness parameter. We use these upper bounds to analyze the complexity of applying gradient methods, which are advantageous when the number of constraints is large, to the reformulation. (c) 2023 Elsevier B.V. All rights reserved.
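The idea in the abstract can be sketched in code: the softplus function mu*log(1 + exp(t/mu)) is a smooth surrogate for max(t, 0), so inequality constraints g_i(x) <= 0 can be folded into a smooth unconstrained objective and minimized by plain gradient descent. The toy instance below (a strongly convex quadratic with linear constraints) and the parameter choices mu, rho are illustrative assumptions, not the paper's experiments; the step size uses the standard bound softplus'' <= 1/(4*mu).

```python
import numpy as np

def softplus(t, mu):
    """Numerically stable mu * log(1 + exp(t / mu))."""
    z = t / mu
    return mu * (np.maximum(z, 0.0) + np.log1p(np.exp(-np.abs(z))))

def softplus_prime(t, mu):
    """Derivative w.r.t. t: the logistic sigmoid of t / mu."""
    return 1.0 / (1.0 + np.exp(-t / mu))

# Toy instance: min 0.5*||x - c||^2  s.t.  A x <= b
# (strongly convex objective, linear inequality constraints).
rng = np.random.default_rng(0)
n, m = 5, 20
A = rng.standard_normal((m, n))
b = rng.random(m)
c = 3.0 * rng.standard_normal(n)

mu, rho = 0.1, 10.0  # smoothness and penalty weights (arbitrary choices)

def F(x):
    """Penalized objective: f(x) + rho * sum_i softplus_mu(g_i(x))."""
    return 0.5 * np.sum((x - c) ** 2) + rho * np.sum(softplus(A @ x - b, mu))

# softplus_mu'' <= 1/(4*mu), so grad F is L-Lipschitz with the bound below;
# gradient descent with step 1/L then decreases F monotonically.
L = 1.0 + rho * np.linalg.norm(A, 2) ** 2 / (4.0 * mu)
x = np.zeros(n)
f0 = F(x)
for _ in range(5000):
    grad = (x - c) + rho * A.T @ softplus_prime(A @ x - b, mu)
    x -= grad / L

print("penalized objective:", f0, "->", F(x))
print("max constraint violation:", max(0.0, float((A @ x - b).max())))
```

Because the gradient of the penalized objective touches all m constraints only through matrix-vector products with A, each iteration is cheap even when m is large, which is the regime the paper targets.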
Pages: 666-672 (7 pages)
References
18 entries in total; entries [11]-[18] shown
[11] Lin HZ, 2018, J MACH LEARN RES, V18
[12] Mishchenko K, 2018, arXiv, DOI arXiv:1810.13387
[13] Nedic A, 2020, IEEE Conference on Decision and Control, p. 372, DOI 10.1109/CDC42340.2020.9303832
[14] Nesterov, Y. Smooth minimization of non-smooth functions. Mathematical Programming, 2005, 103(1): 127-152.
[15] Nesterov, Y. Gradient methods for minimizing composite functions. Mathematical Programming, 2013, 140(1): 125-161.
[16] Sun, P.; Freund, R. M. Computation of minimum-volume covering ellipsoids. Operations Research, 2004, 52(5): 690-706.
[17] Tatarenko, T.; Nedich, A. A smooth inexact penalty reformulation of convex problems with linear constraints. SIAM Journal on Optimization, 2021, 31(3): 2141-2170.
[18] Xu, Y. Iteration complexity of inexact augmented Lagrangian methods for constrained convex programming. Mathematical Programming, 2021, 185(1-2): 199-244.