An accelerated proximal alternating direction method of multipliers for robust fused Lasso

Cited by: 1
Authors
Fan, Yibao [1 ]
Shang, Youlin [1 ]
Jin, Zheng-Fen [1 ,2 ]
Liu, Jia [1 ]
Zhang, Roxin [3 ]
Affiliations
[1] Henan Univ Sci & Technol, Sch Math & Stat, Luolong, Peoples R China
[2] Beihang Univ, Sch Math Sci, LMIB Minist Educ, Beijing, Peoples R China
[3] Northern Michigan Univ, Dept Math & Comp Sci, Marquette, MI USA
Funding
National Natural Science Foundation of China;
Keywords
Robust fused Lasso; accelerated proximal alternating direction method of multipliers; nonergodic convergence; extension factor; REGRESSION SHRINKAGE; ALGORITHM; ESTIMATOR; SELECTION;
DOI
10.1051/ro/2023065
Chinese Library Classification
C93 [Management]; O22 [Operations Research];
Discipline codes
070105; 12; 1201; 1202; 120202;
Abstract
In the era of big data, much data is contaminated by noise with a heavy-tailed distribution. Fused Lasso can effectively handle high-dimensional sparse data with strong correlation between adjacent variables under known Gaussian noise, but it is not robust to non-Gaussian noise with a heavy-tailed distribution. Robust fused Lasso, which uses an l1-norm loss function, overcomes this drawback of fused Lasso when the noise is heavy-tailed; however, the key challenge in solving this model is its nonsmoothness and nonseparability. Therefore, in this paper, we first reformulate the robust fused Lasso into a more tractable form, converting the three-block objective function into a two-block form. We then propose an accelerated proximal alternating direction method of multipliers (APADMM) with an additional update step, which is based on a new PADMM that modifies the update of the Lagrangian multiplier term. Furthermore, we establish an O(1/K) nonergodic convergence rate for the proposed APADMM. Finally, numerical results show that the proposed PADMM and APADMM perform better than other existing ADMM solvers.
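The robust fused Lasso model described in the abstract, min ||Xb - y||_1 + lam1*||b||_1 + lam2*||Db||_1 with D the first-difference operator, can be sketched with a generic variable-stacking ADMM. This is a minimal illustrative splitting (a textbook ADMM, not the paper's reformulation or its accelerated APADMM); all function names and parameter choices here are assumptions for the sketch.

```python
import numpy as np

def soft(v, t):
    # Elementwise soft-thresholding: proximal operator of t*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def robust_fused_lasso_admm(X, y, lam1, lam2, rho=1.0, iters=500):
    """Generic ADMM sketch for min ||Xb - y||_1 + lam1*||b||_1 + lam2*||Db||_1.

    Splits z = A b with A = [X; I; D] stacked, so every block of the
    z-update is a closed-form soft-thresholding step.
    """
    n, p = X.shape
    # First-difference operator D: row i is e_{i+1} - e_i
    D = np.eye(p, k=1)[: p - 1] - np.eye(p)[: p - 1]
    A = np.vstack([X, np.eye(p), D])
    AtA = A.T @ A  # invertible thanks to the identity block
    beta = np.zeros(p)
    z = np.zeros(A.shape[0])
    u = np.zeros(A.shape[0])  # scaled dual variable
    for _ in range(iters):
        # beta-update: least-squares step, argmin (rho/2)||A b - z + u||^2
        beta = np.linalg.solve(AtA, A.T @ (z - u))
        Ab = A @ beta
        v = Ab + u
        # z-update: separable proximal steps on the three stacked blocks
        z = np.concatenate([
            y + soft(v[:n] - y, 1.0 / rho),   # prox of ||. - y||_1
            soft(v[n:n + p], lam1 / rho),     # prox of lam1*||.||_1
            soft(v[n + p:], lam2 / rho),      # prox of lam2*||.||_1 on Db
        ])
        # dual ascent on the scaled multiplier
        u = u + Ab - z
    return beta
```

On a denoising instance (X = I, piecewise-constant signal plus Laplace noise), this recovers the blocky signal; the paper's contribution is replacing such a plain scheme with a two-block proximal ADMM plus an acceleration (extension-factor) step with an O(1/K) nonergodic rate.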
Pages: 1219-1238 (20 pages)