Accelerated Smoothing Hard Thresholding Algorithms for l0 Regularized Nonsmooth Convex Regression Problem

Cited: 0
Authors
Bian, Wei [1 ]
Wu, Fan [1 ]
Affiliation
[1] Harbin Inst Technol, Sch Math, Harbin, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Nonsmooth optimization; Smoothing method; Cardinality penalty; Accelerated algorithm with extrapolation; Convergence rate; PROXIMAL GRADIENT ALGORITHM; VARIABLE SELECTION; CONVERGENCE; SPARSE; MINIMIZATION; SHRINKAGE; EXTRAPOLATION;
DOI
10.1007/s10915-023-02249-8
CLC Number
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
We study a class of constrained sparse optimization problems with cardinality penalty, where the feasible set is defined by a box constraint and the loss function is convex but not necessarily smooth. First, we propose an accelerated smoothing hard thresholding (ASHT) algorithm for solving such problems, which combines smoothing approximation, an extrapolation technique and the iterative hard thresholding method. The extrapolation coefficients β_k can be chosen to satisfy sup_k β_k = 1. We discuss the convergence of the ASHT algorithm under different extrapolation coefficients, and give a sufficient condition ensuring that any accumulation point of the iterates is a local minimizer of the original problem. For a class of special updating schemes of the extrapolation coefficients, we show that the iterates converge to a local minimizer of the problem, and that the convergence rate of the loss and objective function values is o(ln^σ(k)/k) with σ ∈ (1/2, 1]. Second, we consider the case in which the loss function is Lipschitz continuously differentiable, and develop an accelerated hard thresholding (AHT) algorithm to solve it. We prove that the iterates of the AHT algorithm converge to a local minimizer of the problem that satisfies a desirable lower bound property. Moreover, we show that the convergence rate of the loss and objective function values is o(k^{-2}). Finally, some numerical examples are presented to illustrate the theoretical results.
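For illustration only, the Python sketch below shows the generic iteration structure the abstract describes: an extrapolation step, a gradient step on a smoothed surrogate of the nonsmooth loss, and a coordinate-wise hard-thresholding step that respects the box constraint. The function names (asht_sketch, hard_threshold_box, grad_f_mu), the fixed smoothing parameter in the example, and the choice β_k = (k-1)/(k+2) are assumptions made for the sketch; the paper's actual ASHT algorithm (in particular its adaptive smoothing-parameter update and the settings behind its convergence guarantees) is not reproduced here.

```python
import numpy as np

def hard_threshold_box(v, lam, L, lo, hi):
    """Coordinate-wise solution of min_{x in [lo, hi]} (L/2)(x - v)^2 + lam*1[x != 0],
    assuming 0 lies inside the box so that x = 0 is always feasible."""
    z = np.clip(v, lo, hi)                  # best nonzero value: projection onto the box
    keep = 0.5 * L * (z - v) ** 2 + lam     # cost of keeping the coordinate nonzero
    drop = 0.5 * L * v ** 2                 # cost of setting the coordinate to zero
    return np.where(keep < drop, z, 0.0)

def asht_sketch(grad_f_mu, x0, lam, L, lo, hi, iters=500):
    """Minimal accelerated smoothing hard-thresholding loop:
    extrapolate, take a gradient step on the smoothed loss f_mu, then hard-threshold."""
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, iters + 1):
        beta = (k - 1.0) / (k + 2.0)        # illustrative extrapolation coefficients with sup_k beta_k = 1
        y = x + beta * (x - x_prev)         # extrapolation step
        v = y - grad_f_mu(y) / L            # gradient step on the smoothed loss
        x_prev, x = x, hard_threshold_box(v, lam, L, lo, hi)
    return x

# Illustrative usage: a Huber-type smoothing of the l1 data fit ||Ax - b||_1
# with fixed smoothing parameter mu (the paper instead drives mu to zero).
rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 200)), rng.standard_normal(50)
mu = 1e-2
grad = lambda x: A.T @ np.clip((A @ x - b) / mu, -1.0, 1.0)
L = np.linalg.norm(A, 2) ** 2 / mu          # Lipschitz constant of the smoothed gradient
x_hat = asht_sketch(grad, np.zeros(200), lam=0.1, L=L, lo=-1.0, hi=1.0)
```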
Pages: 30