Faster Riemannian Newton-type optimization by subsampling and cubic regularization

Cited by: 0
Authors
Deng, Yian [1 ]
Mu, Tingting [1 ]
Affiliations
[1] Univ Manchester, Dept Comp Sci, Manchester, England
Keywords
Optimization; Cubic regularization; Riemannian manifolds; Subsampling; Algorithms
DOI
10.1007/s10994-023-06321-0
CLC classification
TP18 [Theory of artificial intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This work addresses constrained large-scale non-convex optimization in which the constraint set implies a manifold structure. Solving such problems is important in a multitude of fundamental machine learning tasks. Recent advances in Riemannian optimization have enabled the convenient recovery of solutions by adapting unconstrained optimization algorithms over manifolds. However, it remains challenging to scale up while maintaining stable convergence rates and handling saddle points. We propose a new second-order Riemannian optimization algorithm that aims to improve the convergence rate and reduce the computational cost. It enhances the Riemannian trust-region algorithm, which exploits curvature information to escape saddle points, through a mixture of subsampling and cubic regularization techniques. We conduct a rigorous analysis of the convergence behavior of the proposed algorithm, and perform extensive experiments evaluating it on two general machine learning tasks using multiple datasets. Compared to a large set of state-of-the-art Riemannian optimization algorithms, the proposed algorithm exhibits improved computational speed, e.g., speed-ups ranging from 12% to 227%, and improved convergence behavior, e.g., a reduction in iteration complexity from O(max(epsilon_g^{-2} epsilon_H^{-1}, epsilon_H^{-3})) to O(max(epsilon_g^{-2}, epsilon_H^{-3})).
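The subsampling-plus-cubic-regularization idea summarized in the abstract can be illustrated with a simplified Euclidean, unconstrained sketch (not the paper's Riemannian algorithm): at each step, a random subsample of the data provides gradient and Hessian estimates, and the update approximately minimizes a cubic-regularized quadratic model, whose regularizer tames negative curvature near saddle points. All function names, parameter values, and the least-squares objective below are illustrative assumptions.

```python
import numpy as np

# Illustrative finite-sum problem: least squares f(x) = mean_i (a_i^T x - b_i)^2.
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

def loss(x):
    return np.mean((A @ x - b) ** 2)

def cubic_model(s, g, H, sigma):
    # m(s) = g^T s + 0.5 s^T H s + (sigma/3) ||s||^3  (cubic-regularized model)
    return g @ s + 0.5 * s @ H @ s + (sigma / 3.0) * np.linalg.norm(s) ** 3

def solve_cubic_subproblem(g, H, sigma, iters=300, lr=0.05):
    # Inexact subproblem solver: plain gradient descent on the cubic model.
    s = np.zeros_like(g)
    for _ in range(iters):
        s -= lr * (g + H @ s + sigma * np.linalg.norm(s) * s)
    return s

def subsampled_cubic_step(x, sample_size=80, sigma=1.0):
    # Subsampled gradient and Hessian of the least-squares objective.
    idx = rng.choice(n, size=sample_size, replace=False)
    r = A[idx] @ x - b[idx]
    g = 2.0 * A[idx].T @ r / sample_size
    H = 2.0 * A[idx].T @ A[idx] / sample_size
    return x + solve_cubic_subproblem(g, H, sigma)

x = np.zeros(d)
for _ in range(5):
    x = subsampled_cubic_step(x)
```

In the Riemannian setting of the paper, the gradient and Hessian estimates live in the tangent space at the current iterate and the update is mapped back to the manifold via a retraction; the sketch above drops that machinery to isolate the subsampled cubic-regularized Newton step.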
Pages: 3527-3589
Page count: 63
Related papers
50 records in total
  • [1] Faster Riemannian Newton-type optimization by subsampling and cubic regularization
    Yian Deng
    Tingting Mu
    Machine Learning, 2023, 112 : 3527 - 3589
  • [2] Newton-type methods on Riemannian manifolds under Kantorovich-type conditions
    Amat, S.
    Argyros, I. K.
    Busquier, S.
    Castro, R.
    Hilout, S.
    Plaza, S.
    APPLIED MATHEMATICS AND COMPUTATION, 2014, 227 : 762 - 787
  • [3] Newton-Type Alternating Minimization Algorithm for Convex Optimization
    Stella, Lorenzo
    Themelis, Andreas
    Patrinos, Panagiotis
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2019, 64 (02) : 697 - 711
  • [4] Newton-type methods for simultaneous matrix diagonalization
    Khouja, Rima
    Mourrain, Bernard
    Yakoubsohn, Jean-Claude
    CALCOLO, 2022, 59 (04)
  • [5] Newton-Type Methods: A Broader View
    Izmailov, A. F.
    Solodov, M. V.
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2015, 164 (02) : 577 - 620
  • [6] Riemannian Stochastic Variance-Reduced Cubic Regularized Newton Method for Submanifold Optimization
    Zhang, Dewei
    Tajbakhsh, Sam Davanloo
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2023, 196 (01) : 324 - 361
  • [7] Riemannian Stochastic Variance-Reduced Cubic Regularized Newton Method for Submanifold Optimization
    Dewei Zhang
    Sam Davanloo Tajbakhsh
    Journal of Optimization Theory and Applications, 2023, 196 : 324 - 361
  • [8] Examples of dual behaviour of Newton-type methods on optimization problems with degenerate constraints
    Izmailov, A. F.
    Solodov, M. V.
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2009, 42 (02) : 231 - 264
  • [9] Manifold regularization based on Nystrom type subsampling
    Abhishake
    Sivananthan, S.
    APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, 2020, 49 (01) : 152 - 179
  • [10] Newton-type methods for non-convex optimization under inexact Hessian information
    Xu, Peng
    Roosta, Fred
    Mahoney, Michael W.
    MATHEMATICAL PROGRAMMING, 2020, 184 (1-2) : 35 - 70