Variance-Based Single-Call Proximal Extragradient Algorithms for Stochastic Mixed Variational Inequalities

Cited by: 9
Authors
Yang, Zhen-Ping [1 ]
Lin, Gui-Hua [2 ]
Affiliations
[1] Jiaying Univ, Sch Math, Meizhou 514015, Peoples R China
[2] Shanghai Univ, Sch Management, Shanghai 200444, Peoples R China
Keywords
Stochastic mixed variational inequality; Single-call; Proximal extragradient algorithm; Variance reduction; APPROXIMATION METHODS; LINE SEARCH; SCHEMES;
DOI
10.1007/s10957-021-01882-3
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research];
Subject Classification Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
In the study of stochastic variational inequalities, extragradient algorithms have attracted much attention. However, such schemes generally require two evaluations of the expected mapping at each iteration. In this paper, we present several variance-based single-call proximal extragradient algorithms for solving a class of stochastic mixed variational inequalities, with the aim of alleviating the cost of an extragradient step. One salient feature of the proposed algorithms is that they require only one evaluation of the expected mapping at each iteration; hence, the computational load may be significantly reduced. We show that the proposed algorithms achieve a sublinear ergodic convergence rate in terms of the restricted merit function. Furthermore, under the strongly Minty variational inequality condition, we derive convergence rates for the distance between iterates and solutions, as well as the iteration and oracle complexities of the proposed algorithms when the sample size increases at a geometric or polynomial rate. Numerical experiments indicate that the proposed algorithms are quite competitive with some existing algorithms.
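To illustrate the single-call idea described in the abstract, the following is a minimal Python sketch of a Popov-style stochastic proximal extragradient iteration with mini-batch variance reduction. It is an assumption-laden illustration, not the authors' exact algorithms: the names F_sample, prox_g, the constant step size, and the batch-size schedule are all hypothetical choices made for this sketch.

```python
import numpy as np

# Illustrative sketch (not the authors' exact scheme): a Popov-style
# single-call stochastic proximal extragradient iteration for the mixed VI
#   find x* such that <F(x*), x - x*> + g(x) - g(x*) >= 0 for all x,
# where F(x) = E[F(x, xi)] and g is a closed convex function with a cheap
# proximal operator. F_sample, prox_g, step, batch_size are assumed inputs.

def single_call_prox_extragradient(x0, F_sample, prox_g, step, n_iters,
                                   batch_size=lambda k: (k + 1) ** 2,
                                   rng=None):
    """One stochastic-oracle call per iteration: the extrapolation step
    reuses the mini-batch estimate from the previous extrapolated point."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    # Initial mini-batch estimate of the expected mapping at x0.
    F_prev = np.mean([F_sample(x, rng) for _ in range(batch_size(0))], axis=0)
    avg, weight = np.zeros_like(x), 0.0
    for k in range(n_iters):
        # Extrapolation reuses F_prev instead of a fresh oracle evaluation.
        y = prox_g(x - step * F_prev, step)
        # The only oracle call of the iteration: a mini-batch estimate at y,
        # with a polynomially growing batch size to reduce variance
        # (a geometric schedule is the other option mentioned in the abstract).
        F_prev = np.mean([F_sample(y, rng) for _ in range(batch_size(k + 1))],
                         axis=0)
        x = prox_g(x - step * F_prev, step)
        avg, weight = avg + y, weight + 1.0
    return avg / weight  # ergodic average, matching the ergodic-rate claim
```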
Pages: 393-427
Number of pages: 35