On Biased Stochastic Gradient Estimation
Cited by: 0
Authors:
Driggs, Derek [1]; Liang, Jingwei [2,3]; Schönlieb, Carola-Bibiane [1]
Affiliations:
[1] Univ Cambridge, Dept Appl Math & Theoret Phys, Cambridge CB3 0WA, England
[2] Shanghai Jiao Tong Univ, Inst Nat Sci, Shanghai 200240, Peoples R China
[3] Shanghai Jiao Tong Univ, Sch Math Sci, Shanghai 200240, Peoples R China
Funding:
UK Engineering and Physical Sciences Research Council (EPSRC);
EU Horizon 2020;
Keywords:
stochastic gradient descent;
variance reduction;
biased gradient estimation;
OPTIMIZATION;
ALGORITHM;
DOI:
not available
CLC classification:
TP [automation technology; computer technology];
Discipline code:
0812;
Abstract:
We present a uniform analysis of biased stochastic gradient methods for minimizing convex, strongly convex, and non-convex composite objectives, and identify settings where bias is useful in stochastic gradient estimation. The framework we present allows us to extend proximal support to biased algorithms, including SAG and SARAH, for the first time in the convex setting. We also use our framework to develop a new algorithm, Stochastic Average Recursive GradiEnt (SARGE), that achieves the oracle complexity lower bound for non-convex, finite-sum objectives and requires strictly fewer calls to a stochastic gradient oracle per iteration than SVRG and SARAH. We support our theoretical results with numerical experiments that demonstrate the benefits of certain biased gradient estimators.
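To make the notion of a biased gradient estimator concrete, here is a minimal, illustrative sketch (not the paper's code) of the SARAH recursive estimator on a toy least-squares finite sum. SARAH's update v_k = ∇f_{i_k}(x_k) − ∇f_{i_k}(x_{k−1}) + v_{k−1} is biased conditional on the past iterates, unlike SVRG's unbiased correction; the problem sizes, step size, and loop lengths below are arbitrary choices for demonstration.

```python
import numpy as np

# Toy finite-sum least squares: f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(x, i):
    """Gradient of the i-th component f_i(x) = 0.5 * (a_i^T x - b_i)^2."""
    return (A[i] @ x - b[i]) * A[i]

def full_grad(x):
    """Exact gradient of the full objective."""
    return A.T @ (A @ x - b) / n

def sarah_epoch(x, step=0.01, inner=25):
    """One SARAH outer loop: full-gradient anchor, then recursive updates."""
    v = full_grad(x)                 # anchor with the exact gradient
    x_prev = x.copy()
    x = x - step * v
    for _ in range(inner):
        i = rng.integers(n)
        # Recursive estimate: biased given the past, but low variance
        # because consecutive iterates are close.
        v = grad_i(x, i) - grad_i(x_prev, i) + v
        x_prev = x.copy()
        x = x - step * v
    return x

x = np.zeros(d)
g0 = np.linalg.norm(full_grad(x))    # initial gradient norm
for _ in range(40):
    x = sarah_epoch(x)
g_final = np.linalg.norm(full_grad(x))
print(g0, g_final)                    # the gradient norm shrinks
```

Despite the conditional bias, the recursive estimator drives the full gradient norm toward zero, which is the behavior the paper's uniform analysis makes precise.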
Pages: 43