Hybrid Random/Deterministic Parallel Algorithms for Convex and Nonconvex Big Data Optimization

Cited by: 49
Authors
Daneshmand, Amir [1]
Facchinei, Francisco [2]
Kungurtsev, Vyacheslav [3]
Scutari, Gesualdo [4]
Affiliations
[1] SUNY Buffalo, Dept Elect Engn, Buffalo, NY 14228 USA
[2] Univ Roma La Sapienza, Dept Comp Control & Management Engineering, I-00185 Rome, Italy
[3] Czech Tech Univ, Fac Elect Engn, Dept Comp Sci, Agent Technol Ctr, Prague 16636, Czech Republic
[4] Purdue Univ, Sch Ind Engn, W Lafayette, IN 47907 USA
Funding
U.S. National Science Foundation;
Keywords
Jacobi method; nonconvex problems; parallel and distributed methods; random selections; sparse solution; COORDINATE DESCENT ALGORITHM; MINIMIZATION; CONVERGENCE; SHRINKAGE;
DOI
10.1109/TSP.2015.2436357
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Subject classification codes
0808; 0809;
Abstract
We propose a decomposition framework for the parallel optimization of the sum of a differentiable (possibly nonconvex) function and a nonsmooth (possibly nonseparable), convex one. The latter term is usually employed to enforce structure in the solution, typically sparsity. The main contribution of this work is a novel parallel, hybrid random/deterministic decomposition scheme wherein, at each iteration, a subset of (block) variables is updated at the same time by minimizing a convex surrogate of the original nonconvex function. To tackle huge-scale problems, the (block) variables to be updated are chosen according to a mixed random and deterministic procedure, which captures the advantages of both pure deterministic and random update-based schemes. Almost sure convergence of the proposed scheme is established. Numerical results show that on huge-scale problems the proposed hybrid random/deterministic algorithm compares favorably to random and deterministic schemes on both convex and nonconvex problems.
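To make the scheme described in the abstract concrete, below is a minimal, illustrative Python sketch of the hybrid random/deterministic iteration for the composite problem min_x f(x) + lam*||x||_1, with f smooth and possibly nonconvex. It is not the authors' implementation: the scalar blocks, the quadratic surrogate (which reduces each block update to soft-thresholding), and the parameters sample_frac, greedy_frac, and step are assumptions made for illustration; the paper's framework allows general blocks, general convex surrogates, and nonseparable regularizers.

```python
# Illustrative sketch (not the paper's reference code) of a hybrid
# random/deterministic parallel block-coordinate scheme for
#   min_x  f(x) + lam * ||x||_1,   f smooth, possibly nonconvex.
import numpy as np

def soft_threshold(v, t):
    # Prox of t*||.||_1: closed-form minimizer of the l1-regularized
    # quadratic surrogate on each (scalar) block.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def hybrid_bcd(grad_f, x0, lam, L, n_iter=100, sample_frac=0.5,
               greedy_frac=0.2, step=0.9, seed=None):
    """Hypothetical hybrid random/deterministic scheme (a sketch).

    grad_f : callable returning the gradient of f at x
    L      : Lipschitz constant of grad_f (surrogate curvature)
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    for _ in range(n_iter):
        # 1) Random stage: draw a subset S of coordinates uniformly.
        S = rng.choice(n, size=max(1, int(sample_frac * n)), replace=False)
        # 2) Deterministic stage: within S, keep the coordinates whose
        #    proximal-gradient residual is largest (a greedy criterion).
        #    (A real implementation would compute only the partial
        #    gradients indexed by S, not the full gradient.)
        g = grad_f(x)
        prox = soft_threshold(x[S] - g[S] / L, lam / L)
        resid = np.abs(x[S] - prox)
        k = max(1, int(greedy_frac * S.size))
        chosen = S[np.argsort(resid)[-k:]]
        # 3) Parallelizable update: each chosen block minimizes its own
        #    convex quadratic surrogate, i.e., a soft-thresholding step,
        #    followed by a damped move toward the surrogate minimizer.
        x_hat = soft_threshold(x[chosen] - g[chosen] / L, lam / L)
        x[chosen] += step * (x_hat - x[chosen])
    return x
```

For a least-squares loss f(x) = 0.5*||Ax - b||^2 one would pass, e.g., grad_f = lambda x: A.T @ (A @ x - b) and L = np.linalg.norm(A, 2)**2. The greedy sub-selection in step 2 is what distinguishes the hybrid scheme from a purely random one: only the randomly drawn subset is scored, so the deterministic stage stays cheap relative to scoring all n coordinates.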
Pages: 3914-3929
Number of pages: 16