Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
Cited by: 136
Authors:
Shalev-Shwartz, Shai [1]
Zhang, Tong [2,3]
Affiliations:
[1] Hebrew Univ Jerusalem, Sch Comp Sci & Engn, Jerusalem, Israel
[2] Rutgers State Univ, Dept Stat, Piscataway, NJ 08854 USA
[3] Baidu Inc, Beijing, Peoples R China
Keywords:
OPTIMIZATION; ALGORITHMS; ONLINE
DOI:
10.1007/s10107-014-0839-0
CLC classification:
TP31 [Computer Software]
Discipline codes:
081202; 0835
Abstract:
We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve state-of-the-art results for various key machine learning optimization problems including SVM, logistic regression, ridge regression, Lasso, and multiclass SVM. Experiments validate our theoretical findings.
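The abstract's starting point, plain stochastic dual coordinate ascent, can be illustrated for the ridge-regression special case. The sketch below is a reconstruction for illustration only: it uses the standard closed-form dual update for the squared loss, and it omits the proximal step for general regularizers and the inner-outer acceleration loop that are the paper's actual contributions. The function name and parameters are choices made here, not the paper's.

```python
import numpy as np

def sdca_ridge(X, y, lam, epochs=50, seed=0):
    """Plain (non-accelerated) SDCA sketch for ridge regression:
        min_w (1/n) * sum_i (1/2) * (x_i^T w - y_i)^2 + (lam/2) * ||w||^2.
    Maintains one dual variable alpha_i per example and the primal
    iterate w = X^T alpha / (lam * n)."""
    n, d = X.shape
    alpha = np.zeros(n)              # dual variables, one per example
    w = np.zeros(d)                  # primal iterate kept in sync with alpha
    sq_norms = (X ** 2).sum(axis=1)  # precomputed ||x_i||^2
    rng = np.random.default_rng(seed)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Closed-form maximization of the dual objective over
            # coordinate i (squared loss admits an exact step).
            delta = (y[i] - X[i] @ w - alpha[i]) / (1.0 + sq_norms[i] / (lam * n))
            alpha[i] += delta
            w += delta * X[i] / (lam * n)
    return w
```

On a small problem the iterates converge to the closed-form ridge solution w* = (X^T X / n + lam * I)^{-1} X^T y / n; the accelerated method of the paper reaches a given accuracy with fewer passes when lam is small.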
Pages: 105-145
Page count: 41