Variance Reduced Random Relaxed Projection Method for Constrained Finite-Sum Minimization Problems

Cited by: 2
Authors
Yang, Zhichun [1 ]
Xia, Fu-quan [1 ]
Tu, Kai [2 ]
Yue, Man-Chung [3 ,4 ]
Affiliations
[1] Sichuan Normal Univ, Sch Math Sci, Chengdu 610068, Peoples R China
[2] Shenzhen Univ, Sch Math Sci, Shenzhen 518060, Peoples R China
[3] Univ Hong Kong, Musketeers Fdn Inst Data Sci, Hong Kong, Peoples R China
[4] Univ Hong Kong, Dept Ind & Mfg Syst Engn, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Signal processing algorithms; Convergence; Approximation algorithms; Optimization; Convex functions; Indexes; Standards; Constrained optimization; finite-sum minimization; random projection method; relaxed projection; variance reduction; convergence rate analysis; ERROR-BOUNDS; OPTIMIZATION; CONVERGENCE; DESCENT;
DOI
10.1109/TSP.2024.3386388
CLC number
TM [Electrical engineering]; TN [Electronic technology & communication technology];
Discipline code
0808; 0809;
Abstract
For many applications in signal processing and machine learning, we are tasked with minimizing a large sum of convex functions subject to a large number of convex constraints. In this paper, we devise a new random projection method (RPM) to efficiently solve this problem. Compared with existing RPMs, our proposed algorithm features two useful algorithmic ideas. First, at each iteration, instead of projecting onto the set defined by one of the constraints, our algorithm only requires projecting onto a half-space approximation of the set, which significantly reduces the computational cost as it admits a closed-form formula. Second, to exploit the structure that the objective is a sum, variance reduction is incorporated into our algorithm to further improve the performance. As theoretical contributions, under a novel error bound condition and other standard assumptions, we prove that the proposed RPM converges to an optimal solution and that both optimality and feasibility gaps vanish at a sublinear rate. In particular, via a new analysis framework, we show that our RPM attains a faster convergence rate in optimality gap than existing RPMs when the objective function has a Lipschitz continuous gradient, capitalizing on the benefit of variance reduction. We also provide sufficient conditions for the error bound condition to hold. Experiments on a beamforming problem and a robust classification problem are presented to demonstrate the superiority of our RPM over existing ones.
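To make the two algorithmic ideas in the abstract concrete, the following is a minimal sketch on a toy problem: an SVRG-style variance-reduced gradient step combined with a closed-form projection onto a half-space given by one randomly sampled linear constraint. This is an illustration of the general ideas only, not the paper's exact algorithm; the toy objective, step size, snapshot schedule, and all variable names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy finite-sum problem: minimize f(x) = (1/n) * sum_i 0.5*||x - c_i||^2
# subject to m linear constraints a_j @ x <= b_j.
n, m, d = 20, 5, 3
C = rng.normal(size=(n, d))        # data points c_i
A = rng.normal(size=(m, d))        # constraint normals a_j
b = A @ C.mean(axis=0) + 0.5       # offsets chosen so the unconstrained optimum is feasible

def grad_i(x, i):
    # Gradient of the i-th component 0.5*||x - c_i||^2
    return x - C[i]

def halfspace_project(x, a, rhs):
    # Closed-form projection onto the half-space {y : a @ y <= rhs}
    viol = a @ x - rhs
    return x if viol <= 0 else x - (viol / (a @ a)) * a

x = np.zeros(d)
step = 0.1

for k in range(2000):
    if k % n == 0:                 # periodically refresh the snapshot (SVRG-style)
        snapshot = x.copy()
        full_grad = (snapshot - C).mean(axis=0)
    i = rng.integers(n)
    # Variance-reduced gradient estimator: unbiased, with vanishing variance
    v = grad_i(x, i) - grad_i(snapshot, i) + full_grad
    x = x - step * v
    # Relaxed projection: sample one constraint and project onto its half-space
    j = rng.integers(m)
    x = halfspace_project(x, A[j], b[j])

# x approaches the constrained minimizer (here, the mean of the c_i, which is feasible)
```

The half-space projection costs only one inner product per iteration, in contrast to an exact projection onto the full feasible set, which in general requires solving an auxiliary optimization problem.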
Pages: 2188-2203 (16 pages)