Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization

Cited by: 0
Authors
Szilárd Csaba László
Affiliations
[1] Technical University of Cluj-Napoca, Department of Mathematics
Source
Mathematical Programming | 2021 / Volume 190
Keywords
Inertial algorithm; non-convex optimization; Kurdyka–Łojasiewicz inequality; convergence rate; Łojasiewicz exponent; 90C26; 90C30; 65K10
DOI: not available
Abstract
We investigate an inertial algorithm of gradient type in connection with the minimization of a non-convex differentiable function. The algorithm is formulated in the spirit of Nesterov’s accelerated convex gradient method. We prove some abstract convergence results which, applied to our numerical scheme, show that the generated sequences converge to a critical point of the objective function, provided that a regularization of the objective function satisfies the Kurdyka–Łojasiewicz property. Further, we obtain convergence rates for the generated sequences and the objective function values, formulated in terms of the Łojasiewicz exponent of a regularization of the objective function. Finally, some numerical experiments are presented in order to compare our numerical scheme with some algorithms well known in the literature.
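To fix ideas, the following Python sketch illustrates a generic inertial gradient iteration of the kind the abstract describes: an extrapolated point y_n = x_n + b_n (x_n - x_{n-1}) followed by a gradient step x_{n+1} = y_n - s grad g(y_n). The extrapolation rule b_n = beta * n / (n + alpha), the step size, and the test function are illustrative assumptions, not the paper's exact scheme; the paper should be consulted for the precise parameter conditions under which convergence holds.

import numpy as np

def inertial_gradient(grad_g, x0, step=0.01, beta=0.9, alpha=3.0,
                      max_iter=10000, tol=1e-9):
    # Generic inertial (Nesterov-style) gradient iteration:
    #   y_n     = x_n + b_n * (x_n - x_{n-1})
    #   x_{n+1} = y_n - step * grad_g(y_n)
    # with the (assumed) extrapolation coefficient b_n = beta * n / (n + alpha).
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for n in range(1, max_iter + 1):
        b_n = beta * n / (n + alpha)
        y = x + b_n * (x - x_prev)        # extrapolated (inertial) point
        x_next = y - step * grad_g(y)     # gradient step at the extrapolated point
        if np.linalg.norm(x_next - x) < tol:
            return x_next
        x_prev, x = x, x_next
    return x

# Example on the non-convex double well g(x) = x**4/4 - x**2/2,
# whose gradient is x**3 - x; the critical points are -1, 0, 1.
x_star = inertial_gradient(lambda x: x**3 - x, x0=np.array([0.3]))

For objective functions whose regularization satisfies the Kurdyka–Łojasiewicz property, the paper shows that sequences generated by its scheme converge to a critical point, with rates governed by the Łojasiewicz exponent.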
Pages: 285–329
Page count: 44
Related papers
50 records in total
[21]   Projected Gradient Descent for Non-Convex Sparse Spike Estimation [J].
Traonmilin, Yann ;
Aujol, Jean-François ;
Leclaire, Arthur .
IEEE SIGNAL PROCESSING LETTERS, 2020, 27 :1110-1114
[22]   Convergence rates for the heavy-ball continuous dynamics for non-convex optimization, under Polyak–Łojasiewicz condition [J].
Apidopoulos, Vassilis ;
Ginatta, Nicolò ;
Villa, Silvia .
JOURNAL OF GLOBAL OPTIMIZATION, 2022, 84 (03) :563-589
[23]   A Non-Euclidean Gradient Descent Framework for Non-Convex Matrix Factorization [J].
Hsieh, Ya-Ping ;
Kao, Yu-Chun ;
Mahabadi, Rabeeh Karimi ;
Yurtsever, Alp ;
Kyrillidis, Anastasios ;
Cevher, Volkan .
IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2018, 66 (22) :5917-5926
[24]   Convergence analysis of AdaBound with relaxed bound functions for non-convex optimization [J].
Liu, Jinlan ;
Kong, Jun ;
Xu, Dongpo ;
Qi, Miao ;
Lu, Yinghua .
NEURAL NETWORKS, 2022, 145 :300-307
[25]   Gradient-type penalty method with inertial effects for solving constrained convex optimization problems with smooth data [J].
Boţ, Radu Ioan ;
Csetnek, Ernö Robert ;
Nimana, Nimit .
OPTIMIZATION LETTERS, 2018, 12 (01) :17-33
[27]   Byzantine Resilient Non-Convex SCSG With Distributed Batch Gradient Computations [J].
Bulusu, Saikiran ;
Khanduri, Prashant ;
Kafle, Swatantra ;
Sharma, Pranay ;
Varshney, Pramod K. .
IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS, 2021, 7 :754-766
[28]   Convergence of Adam for non-convex objectives: relaxed hyperparameters and non-ergodic case [J].
Liang, Yuqing ;
He, Meixuan ;
Liu, Jinlan ;
Xu, Dongpo .
MACHINE LEARNING, 2025, 114 (03)
[30]   Damping proximal coordinate descent algorithm for non-convex regularization [J].
Pan, Zheng ;
Lin, Ming ;
Hou, Guangdong ;
Zhang, Changshui .
NEUROCOMPUTING, 2015, 152 :151-163