Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization

Times Cited: 136
Authors
Shalev-Shwartz, Shai [1]
Zhang, Tong [2,3]
Affiliations
[1] Hebrew Univ Jerusalem, Sch Comp Sci & Engn, Jerusalem, Israel
[2] Rutgers State Univ, Dept Stat, Piscataway, NJ 08854 USA
[3] Baidu Inc, Beijing, Peoples R China
Keywords
OPTIMIZATION; ALGORITHMS; ONLINE
DOI
10.1007/s10107-014-0839-0
Chinese Library Classification
TP31 [Computer Software]
Subject Classification Codes
081202; 0835
Abstract
We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve state-of-the-art results for various key machine learning optimization problems including SVM, logistic regression, ridge regression, Lasso, and multiclass SVM. Experiments validate our theoretical findings.
Pages: 105-145
Number of Pages: 41