Rate of Convergence of the FOCUSS Algorithm

Cited by: 6
Authors
Xie, Kan [1 ]
He, Zhaoshui [2 ,3 ]
Cichocki, Andrzej [4 ,5 ,6 ]
Fang, Xiaozhao [7 ]
Affiliations
[1] Guangdong Univ Technol, Sch Automat, Guangzhou 510006, Guangdong, Peoples R China
[2] Guangdong Univ Technol, Sch Automat, Guangdong Key Lab IoT Informat Proc, Guangzhou 510006, Guangdong, Peoples R China
[3] RIKEN Brain Sci Inst, Lab Adv Brain Signal Proc, Saitama 3510198, Japan
[4] RIKEN Brain Sci Inst, Lab Adv Brain Signal Proc, Saitama 3510198, Japan
[5] Polish Acad Sci, Inst Syst Res Sci, PL-00447 Warsaw, Poland
[6] Skolkovo Inst Sci & Technol, Moscow 143025, Russia
[7] Guangdong Univ Technol, Sch Comp, Guangzhou 510006, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Convergence; focal underdetermined system solver (FOCUSS) algorithm; linear convergence; rate of convergence; superlinear convergence; REWEIGHTED LEAST-SQUARES; SPARSE REPRESENTATION; RECONSTRUCTION;
DOI
10.1109/TNNLS.2016.2532358
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Focal underdetermined system solver (FOCUSS) is a powerful method for basis selection and sparse representation, which employs the l(p)-norm with p in (0, 2) to measure the sparsity of solutions. In this paper, we give a systematic analysis of the rate of convergence of the FOCUSS algorithm with respect to p in (0, 2). We prove that the FOCUSS algorithm converges superlinearly for 0 < p < 1 and, in general, linearly for 1 <= p < 2, although it may converge superlinearly in some very special scenarios. In addition, we verify its rates of convergence with respect to p by numerical experiments.
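The iteration analyzed in the abstract can be illustrated with a minimal sketch. This assumes the standard reweighted least-squares form of FOCUSS, x_{k+1} = W_k (A W_k)^+ b with W_k = diag(|x_k|^(1 - p/2)); the function name, problem sizes, and stopping rule below are illustrative choices, not taken from the paper.

```python
import numpy as np

def focuss(A, b, p=0.8, n_iter=50, eps=1e-12):
    """Sketch of the FOCUSS iteration for min ||x||_p s.t. Ax = b.

    Each step solves a reweighted least-squares problem:
        W_k = diag(|x_k|^(1 - p/2)),  x_{k+1} = W_k (A W_k)^+ b.
    For 0 < p < 1 the paper proves superlinear convergence; for
    1 <= p < 2 convergence is in general linear.
    """
    x = np.linalg.pinv(A) @ b            # minimum-l2-norm starting point
    for _ in range(n_iter):
        w = np.abs(x) ** (1.0 - p / 2.0)
        w[w < eps] = eps                 # guard against exactly zero weights
        AW = A * w                       # columnwise scaling: A @ diag(w)
        x = w * (np.linalg.pinv(AW) @ b)
    return x

# Toy underdetermined system with a 2-sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 30))
x_true = np.zeros(30)
x_true[[3, 17]] = [2.0, -1.5]
b = A @ x_true
x_hat = focuss(A, b, p=0.8)
```

With p = 0.8 (the superlinear regime of the paper), a handful of iterations on this toy problem typically concentrates all energy onto the two true support entries.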
Pages: 1276-1289
Number of pages: 14
References
36 entries in total
[1]  
[Anonymous], 1999, Numerical Optimization, DOI 10.1007/b98874
[2]  
[Anonymous], P 8 IEEE DIG SIGN PR
[3]  
[Anonymous], 1987, Unconstrained Optimization: Practical Methods of Optimization
[4]  
Bach F., 2010, P EUR C MACH LEARN P
[5]   Convergence and Rate Analysis of Neural Networks for Sparse Approximation [J].
Balavoine, Aurele ;
Romberg, Justin ;
Rozell, Christopher J. .
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2012, 23 (09) :1377-1389
[6]   Atomic decomposition by basis pursuit [J].
Chen, SSB ;
Donoho, DL ;
Saunders, MA .
SIAM JOURNAL ON SCIENTIFIC COMPUTING, 1998, 20 (01) :33-61
[7]  
Cichocki A., 2003, Adaptive Blind Signal and Image Processing
[8]   Iteratively Reweighted Least Squares Minimization for Sparse Recovery [J].
Daubechies, Ingrid ;
Devore, Ronald ;
Fornasier, Massimo ;
Guentuerk, C. Sinan .
COMMUNICATIONS ON PURE AND APPLIED MATHEMATICS, 2010, 63 (01) :1-38
[9]   Optimally sparse representation in general (nonorthogonal) dictionaries via l1 minimization [J].
Donoho, DL ;
Elad, M .
PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2003, 100 (05) :2197-2202
[10]   Convergence of a Sparse Representations Algorithm Applicable to Real or Complex Data [J].
Fuchs, Jean Jacques .
IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2007, 1 (04) :598-605