CONVERGENCE PROPERTIES OF PROXIMAL (SUB)GRADIENT METHODS WITHOUT CONVEXITY OR SMOOTHNESS OF ANY OF THE FUNCTIONS
Cited by: 0
Authors:
Solodov, Mikhail V. [1]
Affiliation:
[1] IMPA Inst Matemat Pura & Aplicada, Estrada Dona Castorina, BR-22460320 Rio De Janeiro, RJ, Brazil
Keywords:
proximal gradient methods; incremental methods; nonsmooth nonconvex optimization; MINIMIZATION; OPTIMIZATION; ALGORITHMS; NONCONVEX
DOI: 10.1137/23M1592158
Chinese Library Classification: O29 [Applied Mathematics]
Subject Classification Code: 070104
Abstract:
We establish convergence properties for a framework that includes a variety of proximal subgradient methods, where none of the involved functions needs to be convex or differentiable. The functions are assumed to be Clarke-regular. Our results cover the projected and conditional variants for the constrained case, the use of inertial/momentum terms, and incremental methods when each of the functions is itself a sum and the methods process the components of this sum separately.
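For orientation only, and not quoted from the paper itself: a proximal subgradient step with an inertial/momentum term, of the general kind the abstract refers to, is commonly written as below. The objective f + h, the stepsize \alpha_k, and the momentum parameter \beta_k are generic symbols assumed here for illustration, not notation taken from this record.

\[
x^{k+1} \in \operatorname{prox}_{\alpha_k h}\!\bigl( x^k + \beta_k (x^k - x^{k-1}) - \alpha_k g^k \bigr),
\qquad g^k \in \partial f(x^k),
\]

where \partial f denotes the Clarke subdifferential, \alpha_k > 0 is the stepsize, and \beta_k \ge 0 is the inertial/momentum parameter; taking \beta_k = 0 recovers the plain proximal subgradient iteration.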
Pages: 28-41
Number of pages: 14