Convergence analysis of stochastic higher-order majorization-minimization algorithms

Cited by: 0
Authors
Lupu, Daniela [1 ]
Necoara, Ion [2 ,3 ]
Affiliations
[1] Univ Politehn Bucuresti, Automat Control & Syst Engn Dept, Bucharest, Romania
[2] Romanian Acad, Gheorghe Mihoc Caius Iacob Inst Math Stat & Appl M, Bucharest, Romania
[3] Romanian Acad, Gheorghe Mihoc Caius Iacob Inst Math Stat & Appl M, Bucharest 050711, Romania
Keywords
Finite-sum optimization; majorization-minimization; stochastic higher-order algorithms; minibatch; convergence rates; NEWTON METHOD; OPTIMIZATION;
DOI
10.1080/10556788.2023.2256447
Chinese Library Classification
TP31 [Computer software];
Subject Classification Codes
081202; 0835;
Abstract
Majorization-minimization schemes are a broad class of iterative methods for general optimization problems, including nonconvex, nonsmooth and stochastic ones. These algorithms successively minimize a sequence of upper bounds of the objective function, so that the objective value decreases along the iterations. We present a stochastic higher-order algorithmic framework for minimizing the average of a very large number of sufficiently smooth functions. Our framework is based on the notion of stochastic higher-order upper bound approximations of the finite-sum objective function, combined with minibatching. We derive convergence results for nonconvex and convex optimization problems when the higher-order approximation of the objective function yields an error that is p times differentiable and has a Lipschitz continuous p-th derivative. More precisely, for general nonconvex problems we present asymptotic stationary point guarantees, and under the Kurdyka-Łojasiewicz property we derive local convergence rates ranging from sublinear to linear. For convex problems with a uniformly convex objective function, we derive local (super)linear convergence results for our algorithm. Compared to existing stochastic (first-order) methods, our algorithm adapts to the problem's curvature and allows any batch size. Preliminary numerical tests support the effectiveness of our algorithmic framework.
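The simplest instance of such a scheme (the first-order case p = 1, where the minibatch upper bound is a quadratic majorizer minimized in closed form) can be sketched as follows. This is an illustrative sketch, not the paper's algorithm: the least-squares finite sum, the regularization constant M and all variable names are assumptions made for the example.

```python
import numpy as np

# Synthetic finite-sum least-squares problem: f(x) = (1/n) sum_i 0.5*(a_i^T x - b_i)^2.
rng = np.random.default_rng(0)
n, d, batch = 200, 5, 20
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

def full_obj(x):
    """Full finite-sum objective value."""
    return 0.5 * np.mean((A @ x - b) ** 2)

# M majorizes the gradient Lipschitz constant of every component f_i,
# hence of every minibatch average, so the quadratic model
#   g(x; x_k) = f_B(x_k) + grad_B(x_k)^T (x - x_k) + (M/2) ||x - x_k||^2
# is a true upper bound of the minibatch objective.
M = (np.linalg.norm(A, axis=1) ** 2).max()

x = np.zeros(d)
for _ in range(300):
    idx = rng.choice(n, size=batch, replace=False)     # sample a minibatch B
    g = A[idx].T @ (A[idx] @ x - b[idx]) / batch       # minibatch gradient
    x = x - g / M                                      # exact minimizer of g(.; x_k)

print(f"initial {full_obj(np.zeros(d)):.4f} -> final {full_obj(x):.6f}")
```

Each iteration minimizes the stochastic upper bound exactly, which for this quadratic majorizer reduces to a gradient step of length 1/M; higher-order variants replace the model with a p-th order Taylor-like upper bound whose minimization is no longer closed-form.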
Pages: 384-413 (30 pages)