Last-Iterate Convergence of Optimistic Gradient Method for Monotone Variational Inequalities

Cited: 0
Authors
Gorbunov, Eduard [1 ,2 ,3 ,4 ]
Taylor, Adrien [5 ,6 ,7 ]
Affiliations
[1] MIPT, Dolgoprudnyi, Russia
[2] Mila, Montreal, PQ, Canada
[3] UdeM, Montreal, PQ, Canada
[4] MBZUAI, Abu Dhabi, U Arab Emirates
[5] INRIA, CNRS, Paris, France
[6] Ecole Normale Super, CNRS, Paris, France
[7] PSL Res Univ, Paris, France
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022 | 2022
Funding
European Research Council;
Keywords
WORST-CASE PERFORMANCE; 1ST-ORDER METHODS; CONVEX; OPTIMIZATION; SEARCH; TIGHT;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The Past Extragradient (PEG) method [Popov, 1980], also known as the Optimistic Gradient method, has recently gained renewed interest in the optimization community with the emergence of variational inequality formulations for machine learning. Recently, in the unconstrained case, Golowich et al. [2020a] proved that an O(1/N) last-iterate convergence rate in terms of the squared norm of the operator can be achieved for Lipschitz and monotone operators with a Lipschitz Jacobian. In this work, by introducing a novel analysis through potential functions, we show that (i) this O(1/N) last-iterate convergence can be achieved without any assumption on the Jacobian of the operator, and (ii) it can be extended to the constrained case, which had not been derived before even under Lipschitzness of the Jacobian. The proof is significantly different from the one of Golowich et al. [2020a], and its discovery was computer-aided. These results close the open question of the last-iterate convergence of PEG for monotone variational inequalities.
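The PEG/Optimistic Gradient update reuses the previous extrapolation point instead of recomputing it, so each iteration needs only one new operator evaluation. A minimal sketch of the unconstrained update, run on the classical bilinear saddle problem min_x max_y xy (whose operator F(z) = (y, -x) is monotone and 1-Lipschitz); the operator, step size, and variable names are illustrative choices for this demo, not taken from the paper:

```python
import numpy as np

def F(z):
    # Monotone, 1-Lipschitz operator of the bilinear game min_x max_y x*y
    x, y = z
    return np.array([y, -x])

gamma = 0.2                     # step size; theory needs gamma small w.r.t. 1/L
z = np.array([1.0, 1.0])        # main iterate z_k
z_bar = z.copy()                # "past" extrapolation point z~_{k-1}

for k in range(2000):
    z_bar_new = z - gamma * F(z_bar)   # z~_k   = z_k - gamma * F(z~_{k-1})
    z = z - gamma * F(z_bar_new)       # z_{k+1} = z_k - gamma * F(z~_k)
    z_bar = z_bar_new

print(np.linalg.norm(F(z)))     # operator norm at the last iterate: small
```

On this bilinear problem plain gradient descent-ascent diverges, while PEG's single past-gradient correction makes the last iterate converge; the paper's result quantifies this as an O(1/N) rate on ||F(z_N)||^2 for general monotone Lipschitz operators.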
Pages: 13
Related Papers
50 records in total
  • [21] CONVERGENCE RATE OF A GRADIENT PROJECTION METHOD FOR SOLVING VARIATIONAL INEQUALITIES
    Pham Duy Khanh
    Le Van Vinh
    Phan Tu Vuong
    JOURNAL OF NONLINEAR AND VARIATIONAL ANALYSIS, 2021, 5 (06): 951-964
  • [22] Optimistic Dual Extrapolation for Coherent Non-monotone Variational Inequalities
    Song, Chaobing
    Zhou, Zhengyuan
    Zhou, Yichao
    Jiang, Yong
    Ma, Yi
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [23] Strong convergence of a hybrid method for monotone variational inequalities and fixed point problems
    Yonghong Yao
    Yeong-Cheng Liou
    Mu-Ming Wong
    Jen-Chih Yao
    FIXED POINT THEORY AND APPLICATIONS, 2011
  • [24] Some convergence properties of a method of multipliers for linearly constrained monotone variational inequalities
    He, BS
    Yang, H
    OPERATIONS RESEARCH LETTERS, 1998, 23 (3-5): 151-161
  • [25] On the Weak Convergence of the Extragradient Method for Solving Pseudo-Monotone Variational Inequalities
    Phan Tu Vuong
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2018, 176: 399-409
  • [26] Strong convergence of a hybrid method for monotone variational inequalities and fixed point problems
    Yao, Yonghong
    Liou, Yeong-Cheng
    Wong, Mu-Ming
    Yao, Jen-Chih
    FIXED POINT THEORY AND APPLICATIONS, 2011
  • [27] On the Weak Convergence of the Extragradient Method for Solving Pseudo-Monotone Variational Inequalities
    Phan Tu Vuong
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2018, 176 (02): 399-409
  • [28] ON THE CONVERGENCE OF DESCENT METHODS FOR MONOTONE VARIATIONAL-INEQUALITIES
    PATRIKSSON, M
    OPERATIONS RESEARCH LETTERS, 1994, 16 (05): 265-269
  • [29] Continuation method for monotone variational inequalities
    Chen, B.
    Harker, P.T.
    MATHEMATICAL PROGRAMMING, SERIES A, 1995, 69 (02)
  • [30] A Projected Extrapolated Gradient Method with Larger Step Size for Monotone Variational Inequalities
    Xiaokai Chang
    Jianchao Bai
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2021, 190: 602-627