Sharper Rates for Separable Minimax and Finite Sum Optimization via Primal-Dual Extragradient Methods

Cited by: 0
Authors
Jin, Yujia
Sidford, Aaron
Tian, Kevin
Institutions
Source
CONFERENCE ON LEARNING THEORY (COLT), 2022 / Vol. 178
Keywords
convex optimization; first-order methods; stochastic optimization; minimax optimization; acceleration; variational inequalities
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
We design accelerated algorithms with improved rates for several fundamental classes of optimization problems. Our algorithms all build upon techniques related to the analysis of primal-dual extragradient methods via relative Lipschitzness proposed recently by Cohen et al. (2021).

(1) Separable minimax optimization. We study separable minimax optimization problems of the form $\min_x \max_y f(x) - g(y) + h(x, y)$, where $f$ and $g$ have smoothness and strong convexity parameters $(L_x, \mu_x)$ and $(L_y, \mu_y)$, and $h$ is convex-concave with a $(\Lambda_{xx}, \Lambda_{xy}, \Lambda_{yy})$-blockwise operator norm bounded Hessian. We provide an algorithm using $\tilde{O}(\sqrt{L_x/\mu_x} + \sqrt{L_y/\mu_y} + \Lambda_{xx}/\mu_x + \Lambda_{xy}/\sqrt{\mu_x \mu_y} + \Lambda_{yy}/\mu_y)$ gradient queries. Notably, for convex-concave minimax problems with bilinear coupling (e.g. quadratics), where $\Lambda_{xx} = \Lambda_{yy} = 0$, our rate matches a lower bound of Zhang et al. (2019).

(2) Finite sum optimization. We study finite sum optimization problems of the form $\min_x \frac{1}{n} \sum_{i \in [n]} f_i(x)$, where each $f_i$ is $L_i$-smooth and the overall problem is $\mu$-strongly convex. We provide an algorithm using $\tilde{O}(n + \sum_{i \in [n]} \sqrt{L_i/(n\mu)})$ gradient queries. Notably, when the smoothness bounds $\{L_i\}_{i \in [n]}$ are non-uniform, our rate improves upon accelerated SVRG (Lin et al., 2015; Frostig et al., 2015) and Katyusha (Allen-Zhu, 2017) by up to a $\sqrt{n}$ factor.

(3) Minimax finite sums. We generalize our algorithms for minimax and finite sum optimization to solve a natural family of minimax finite sum optimization problems at an accelerated rate, encapsulating both of the above results up to a logarithmic $\sqrt{n}$ factor.
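The abstract's algorithms build on primal-dual extragradient methods. As a point of reference, the sketch below applies the classical extragradient update (not the paper's accelerated method) to a toy separable minimax instance with bilinear coupling, $\min_x \max_y \frac{\mu_x}{2}\|x\|^2 - \frac{\mu_y}{2}\|y\|^2 + x^\top A y$; the problem size, step size, and iteration count are illustrative choices, not values from the paper.

```python
import numpy as np

# Toy instance: min_x max_y (mu_x/2)||x||^2 - (mu_y/2)||y||^2 + x^T A y.
# Its unique saddle point is (0, 0).
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))   # bilinear coupling; Lambda_xy ~ ||A||_op
mu_x, mu_y = 1.0, 1.0             # strong convexity / concavity moduli

def operator(x, y):
    """Monotone gradient operator F(x, y) = (grad_x L, -grad_y L)."""
    return mu_x * x + A @ y, mu_y * y - A.T @ x

x, y = np.ones(n), np.ones(n)
# Conservative step size, below 1/Lipschitz(F).
eta = 0.5 / (max(mu_x, mu_y) + np.linalg.norm(A, 2))

for _ in range(2000):
    gx, gy = operator(x, y)                    # extrapolation half-step
    xh, yh = x - eta * gx, y - eta * gy
    gx, gy = operator(xh, yh)                  # full step at the midpoint
    x, y = x - eta * gx, y - eta * gy

# Strong monotonicity of F gives linear convergence to the saddle point.
print(np.linalg.norm(x), np.linalg.norm(y))
```

The extra "midpoint" gradient evaluation is what distinguishes extragradient from plain gradient descent-ascent, which can cycle or diverge on coupled saddle-point problems even with small steps.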
Pages: 54