Stochastic Smoothed Gradient Descent Ascent for Federated Minimax Optimization

Citations: 0
Authors
Shen, Wei [1 ]
Huang, Minhui [2 ]
Zhang, Jiawei [3 ]
Shen, Cong [1 ]
Affiliations
[1] Univ Virginia, Charlottesville, VA 22904 USA
[2] Meta AI, New York, NY USA
[3] MIT, 77 Massachusetts Ave, Cambridge, MA 02139 USA
Funding
National Science Foundation (USA);
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
In recent years, federated minimax optimization has attracted growing interest due to its extensive applications in various machine learning tasks. While Smoothed Alternating Gradient Descent Ascent (Smoothed-AGDA) has proved successful in centralized nonconvex minimax optimization, whether and how smoothing techniques could help in a federated setting remains unexplored. In this paper, we propose a new algorithm termed Federated Stochastic Smoothed Gradient Descent Ascent (FESS-GDA), which utilizes the smoothing technique for federated minimax optimization. We prove that FESS-GDA can be uniformly applied to solve several classes of federated minimax problems, and we establish new or improved analytical convergence results for these settings. We showcase the practical efficiency of FESS-GDA on federated learning tasks such as training generative adversarial networks (GANs) and fair classification.
Pages: 44
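
For readers unfamiliar with the smoothing technique the abstract builds on, below is a minimal single-machine sketch of a Smoothed-AGDA-style iteration on min_x max_y f(x, y). The function names, step sizes, and toy objective are illustrative assumptions; this is not the paper's federated FESS-GDA algorithm, which additionally involves local client updates and server aggregation.

import numpy as np

def smoothed_agda(grad_x, grad_y, x0, y0, steps=2000,
                  eta_x=0.02, eta_y=0.05, p=1.0, beta=0.1):
    """Sketch of a Smoothed-AGDA iteration for min_x max_y f(x, y).

    grad_x(x, y) and grad_y(x, y) return (possibly stochastic)
    gradients of f. The anchor z adds the proximal term
    (p/2) * ||x - z||^2 that smooths the primal problem; all
    hyperparameters here are illustrative, not the paper's values.
    """
    x, y, z = x0.copy(), y0.copy(), x0.copy()
    for _ in range(steps):
        # Descent step on the smoothed objective f(x, y) + (p/2)||x - z||^2
        x = x - eta_x * (grad_x(x, y) + p * (x - z))
        # Alternating ascent step uses the freshly updated x
        y = y + eta_y * grad_y(x, y)
        # The anchor z slowly tracks x (exponential averaging)
        z = z + beta * (x - z)
    return x, y

# Toy saddle problem: f(x, y) = x.y - 0.5*||y||^2, saddle point at x = y = 0
gx = lambda x, y: y            # gradient of f in x
gy = lambda x, y: x - y        # gradient of f in y
x_star, y_star = smoothed_agda(gx, gy, np.ones(3), np.ones(3))
print(x_star, y_star)          # both approach the zero vector

The design intuition is that the anchor z turns each primal step into a descent step on a better-conditioned proximal surrogate of the original objective, which is the property the abstract credits for Smoothed-AGDA's success in the centralized nonconvex setting.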