Efficient Mirror Descent Ascent Methods for Nonsmooth Minimax Problems

Cited by: 0
Authors
Huang, Feihu [1]
Wu, Xidong [1]
Huang, Heng [1]
Affiliations
[1] Univ Pittsburgh, Dept Elect & Comp Engn, Pittsburgh, PA 15260 USA
Keywords
GRADIENT
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In the paper, we propose a class of efficient mirror descent ascent methods to solve nonsmooth nonconvex-strongly-concave minimax problems by using dynamic mirror functions, and introduce a convergence-analysis framework to conduct rigorous theoretical analysis of our mirror descent ascent methods. For our stochastic algorithms, we first prove that the mini-batch stochastic mirror descent ascent (SMDA) method obtains a gradient complexity of $O(\kappa^3 \epsilon^{-4})$ for finding an $\epsilon$-stationary point, where $\kappa$ denotes the condition number. Further, we propose an accelerated stochastic mirror descent ascent (VR-SMDA) method based on the variance-reduction technique. We prove that our VR-SMDA method achieves a lower gradient complexity of $O(\kappa^3 \epsilon^{-3})$. For our deterministic algorithm, we prove that our deterministic mirror descent ascent (MDA) method achieves a lower gradient complexity of $O(\sqrt{\kappa}\,\epsilon^{-2})$ under mild conditions, which matches the best known complexity for solving smooth nonconvex-strongly-concave minimax optimization. We conduct experiments on fair-classifier and robust neural network training tasks to demonstrate the efficiency of our new algorithms.
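To make the descent-ascent update structure concrete, below is a minimal Python sketch of one mini-batch stochastic mirror descent ascent loop. It is an illustration under stated assumptions, not the authors' algorithm: it uses the Euclidean mirror map (under which the mirror step reduces to a plain gradient step), fixed step sizes, and a toy quadratic objective that is strongly concave in y, whereas the paper's SMDA employs dynamic mirror functions. The names `smda`, `grad_x`, and `grad_y` are hypothetical.

```python
# Illustrative sketch only (assumed names, not from the paper):
# mini-batch stochastic mirror descent ascent with the Euclidean
# mirror map, under which the mirror step is a plain gradient step.
import numpy as np

def smda(grad_x, grad_y, x0, y0, eta_x=0.01, eta_y=0.1,
         batch_size=32, iters=2000, rng=None):
    """Single-loop mini-batch SMDA on min_x max_y E[f(x, y; xi)]."""
    rng = rng or np.random.default_rng(0)
    x, y = x0.copy(), y0.copy()
    for _ in range(iters):
        # Draw a mini-batch of noise samples xi (stand-in for data).
        batch = rng.standard_normal((batch_size, x0.size))
        gx = np.mean([grad_x(x, y, xi) for xi in batch], axis=0)
        gy = np.mean([grad_y(x, y, xi) for xi in batch], axis=0)
        x = x - eta_x * gx   # (Euclidean) mirror descent step on x
        y = y + eta_y * gy   # gradient ascent step on y
    return x, y

# Toy objective f = 0.5||x||^2 + x^T y - 0.5||y||^2 (strongly concave
# in y) with additive gradient noise; its unique saddle point is (0, 0).
gx = lambda x, y, xi: x + y + 0.01 * xi
gy = lambda x, y, xi: x - y + 0.01 * xi
x_hat, y_hat = smda(gx, gy, np.ones(5), np.ones(5))
```

Replacing the Euclidean step on x with a Bregman proximal step would recover a general mirror descent update; the dynamic mirror functions of the paper go further and are not modeled in this sketch.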
Pages: 13