Parallel Selective Algorithms for Nonconvex Big Data Optimization

Cited by: 124
Authors
Facchinei, Francisco [1 ]
Scutari, Gesualdo [2 ]
Sagratella, Simone [1 ]
Affiliations
[1] Univ Roma La Sapienza, Dept Comp Control & Management Engn, I-00185 Rome, Italy
[2] SUNY Buffalo, Dept Elect Engn, Buffalo, NY 14260 USA
Funding
U.S. National Science Foundation;
Keywords
Parallel optimization; variables selection; distributed methods; Jacobi method; LASSO; sparse solution; DESCENT ALGORITHMS; MINIMIZATION; DECOMPOSITION; REGRESSION; SHRINKAGE;
DOI
10.1109/TSP.2015.2399858
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic and Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
We propose a decomposition framework for the parallel optimization of the sum of a differentiable (possibly nonconvex) function and a (block) separable nonsmooth, convex one. The latter term is usually employed to enforce structure in the solution, typically sparsity. Our framework is very flexible and includes both fully parallel Jacobi schemes and Gauss-Seidel (i.e., sequential) ones, as well as virtually all possibilities "in between" with only a subset of variables updated at each iteration. Our theoretical convergence results improve on existing ones, and numerical results on LASSO, logistic regression, and some nonconvex quadratic problems show that the new method consistently outperforms existing algorithms.
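As a rough, hypothetical illustration of the kind of scheme the abstract describes (and not the authors' actual algorithm), the Python sketch below applies a fully parallel Jacobi-type proximal-gradient update to the LASSO problem, minimize 0.5*||Ax - b||^2 + lam*||x||_1: all coordinates are updated simultaneously from the same iterate, and a diminishing step size blends the candidate point into the current one. The function names, step-size rule, and synthetic data are illustrative assumptions, not taken from the paper.

import numpy as np

# Hypothetical illustration, not the paper's method: a fully parallel
# Jacobi-type proximal-gradient update for LASSO,
#   minimize 0.5*||A @ x - b||^2 + lam*||x||_1,
# where every coordinate is updated simultaneously from the shared iterate
# and a diminishing step size blends the update into the current point.

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (soft-thresholding), applied elementwise.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def parallel_jacobi_lasso(A, b, lam, iters=200, gamma0=0.9):
    m, n = A.shape
    x = np.zeros(n)
    tau = np.sum(A * A, axis=0) + 1e-12    # per-coordinate curvature estimates
    for k in range(iters):
        grad = A.T @ (A @ x - b)           # full gradient at the shared iterate
        # Jacobi step: all scalar subproblems are solved "in parallel".
        x_hat = soft_threshold(x - grad / tau, lam / tau)
        gamma = gamma0 / (1.0 + 0.01 * k)  # diminishing step size
        x = x + gamma * (x_hat - x)
    return x

# Usage sketch on synthetic sparse data (illustrative only).
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 500))
x_true = np.zeros(500)
x_true[:10] = rng.standard_normal(10)
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_est = parallel_jacobi_lasso(A, b, lam=0.1)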
Pages: 1874-1889
Number of pages: 16