On the Analysis of Variance-reduced and Randomized Projection Variants of Single Projection Schemes for Monotone Stochastic Variational Inequality Problems

Cited by: 13
Authors
Cui, Shisheng [1 ]
Shanbhag, Uday V. [1 ]
Affiliations
[1] Pennsylvania State University, Department of Industrial and Manufacturing Engineering, University Park, PA 16802, USA
Funding
U.S. National Science Foundation
Keywords
Variational inequality problems; Stochastic optimization; Extragradient schemes; Stochastic approximation; Approximation methods; Extragradient method; Regularization; Smooth
DOI
10.1007/s11228-021-00572-6
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
Classical extragradient schemes and their stochastic counterparts represent a cornerstone for resolving monotone variational inequality problems. Yet, such schemes have a per-iteration complexity of two projections onto a convex set and require two evaluations of the map, the former of which could be relatively expensive. We consider two related avenues where the per-iteration complexity is significantly reduced: (i) a stochastic projected reflected gradient method requiring a single evaluation of the map and a single projection; and (ii) a stochastic subgradient extragradient method that requires two evaluations of the map, a single projection onto the associated feasibility set, and a significantly cheaper projection (onto a halfspace) computable in closed form. Under a variance-reduced framework reliant on a sample average of the map based on an increasing batch size, we prove almost sure convergence of the iterates to a random point in the solution set for both schemes. Additionally, non-asymptotic rate guarantees are derived for both schemes in terms of the gap function; notably, both rates match the best-known rates obtained in deterministic regimes. To address feasibility sets given by the intersection of a large number of convex constraints, we adapt both of the aforementioned schemes to a random projection framework. We then show that the random projection analogs of both schemes also display almost sure convergence under a weak-sharpness requirement; furthermore, without imposing the weak-sharpness requirement, both schemes achieve the optimal rate in terms of the gap function of the projection of the averaged sequence onto the feasibility set, as well as in terms of the infeasibility of this sequence. Preliminary numerics support the theoretical findings, and both schemes outperform standard extragradient schemes in terms of per-iteration complexity.
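To make the two single-projection templates from the abstract concrete, the following is a minimal sketch (not the authors' exact statement) of the updates for VI(X, F) under the variance-reduced framework; the step length $\gamma_k$, the increasing batch size $N_k$, the samples $\xi_{j,k}$, and the Euclidean projection $\Pi_X$ are notational assumptions introduced here for illustration:

\begin{align*}
  \bar F_k(y) &\;\triangleq\; \tfrac{1}{N_k}\textstyle\sum_{j=1}^{N_k} F(y,\xi_{j,k})
      && \text{(sample-average, variance-reduced oracle)}\\
  x_{k+1} &\;=\; \Pi_X\!\bigl(x_k - \gamma_k\,\bar F_k(2x_k - x_{k-1})\bigr)
      && \text{(i) stochastic projected reflected gradient}\\
  y_k &\;=\; \Pi_X\!\bigl(x_k - \gamma_k\,\bar F_k(x_k)\bigr)
      && \text{(ii) stochastic subgradient extragradient}\\
  H_k &\;\triangleq\; \bigl\{\,w : \langle x_k - \gamma_k\,\bar F_k(x_k) - y_k,\; w - y_k\rangle \le 0\,\bigr\}\\
  x_{k+1} &\;=\; \Pi_{H_k}\!\bigl(x_k - \gamma_k\,\bar F_k(y_k)\bigr)
\end{align*}

Scheme (i) uses one map evaluation (at the reflected point $2x_k - x_{k-1}$) and one projection onto $X$ per iteration, while scheme (ii) uses two map evaluations, one projection onto $X$, and one projection onto the halfspace $H_k$, which is available in closed form.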
Pages: 453-499
Number of pages: 47