A graph decomposition-based approach for the graph-fused lasso

Times Cited: 1
Authors
Yu, Feng [1 ]
Yang, Archer Yi [2 ]
Zhang, Teng [3 ]
Affiliations
[1] Univ Texas El Paso, 500 W Univ Ave, El Paso, TX 79968 USA
[2] McGill Univ, 805 Sherbrooke St West, Montreal, PQ H3A 0B9, Canada
[3] Univ Cent Florida, 4000 Cent Florida Blvd, Orlando, FL 32816 USA
Funding
U.S. National Science Foundation (NSF);
Keywords
ADMM; Graph-fused lasso; Nonsmooth convex optimization; Total variation minimization; Graph decomposition; Douglas-Rachford splitting method; Algorithm; Convergence
DOI
10.1016/j.jspi.2024.106221
Chinese Library Classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics]
Discipline Codes
020208; 070103; 0714
Abstract
We propose a new algorithm for solving the graph-fused lasso (GFL), a regularized model that assumes the signal is locally constant over a predefined graph structure. The proposed method applies a novel decomposition of the objective function within the alternating direction method of multipliers (ADMM) framework. While ADMM has been widely used for fused lasso problems, existing approaches such as the network lasso decompose the objective function into a loss component and a total variation penalty component. In contrast, drawing on the graph matching technique from graph theory, we propose a new decomposition that splits the objective into two components: one is the loss function plus part of the total variation penalty, and the other is the remaining total variation penalty. We derive an exact convergence rate for the proposed algorithm by developing a general theory on the local convergence of ADMM. Compared with the network lasso algorithm, our algorithm has a faster exact linear convergence rate (though of the same order as the network lasso) and a smaller computational cost per iteration, so it converges faster overall in most numerical examples.
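As a rough illustration of the decomposition idea described in the abstract (not the authors' exact algorithm), one can partition the graph's edge set using a maximal matching: the total variation penalty over the matched edges is grouped with the loss term, while the penalty over the remaining edges forms the second ADMM block. The sketch below uses a simple greedy matching on a small hypothetical 2x2 grid graph; the function name and edge list are illustrative assumptions.

```python
# Hedged sketch: split a graph's edges via a greedy maximal matching,
# mirroring the two-component objective decomposition for GFL-style ADMM.

def greedy_maximal_matching(edges):
    """Greedily select vertex-disjoint edges (a maximal matching)."""
    matched_vertices = set()
    matching = []
    for u, v in edges:
        if u not in matched_vertices and v not in matched_vertices:
            matching.append((u, v))
            matched_vertices.update((u, v))
    return matching

# Hypothetical example: a 2x2 grid graph with vertices 0..3.
edges = [(0, 1), (2, 3), (0, 2), (1, 3)]
matching = greedy_maximal_matching(edges)
remainder = [e for e in edges if e not in matching]

# The total variation penalty (a sum of |x_u - x_v| terms over all edges)
# splits into the sum over `matching` (handled jointly with the loss)
# plus the sum over `remainder` (the second ADMM component).
print(matching)   # [(0, 1), (2, 3)]
print(remainder)  # [(0, 2), (1, 3)]
```

Because edges in a matching share no vertices, the subproblem over the matched edges decouples into independent pairwise terms, which is what makes this split attractive for ADMM.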
Pages: 16