50 entries in total
- [1] AdaGDA: Faster Adaptive Gradient Descent Ascent Methods for Minimax Optimization. International Conference on Artificial Intelligence and Statistics, vol. 206, 2023.
- [2] Stochastic Recursive Gradient Descent Ascent for Stochastic Nonconvex-Strongly-Concave Minimax Problems. Advances in Neural Information Processing Systems 33 (NeurIPS 2020), 2020.
- [3] Universal Gradient Descent Ascent Method for Nonconvex-Nonconcave Minimax Optimization. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
- [9] Randomized Stochastic Gradient Descent Ascent. International Conference on Artificial Intelligence and Statistics, vol. 151, 2022.
- [10] Near-optimal Local Convergence of Alternating Gradient Descent-Ascent for Minimax Optimization. International Conference on Artificial Intelligence and Statistics, vol. 151, 2022.