Two Conditions for Equivalence of 0-Norm Solution and 1-Norm Solution in Sparse Representation

Times Cited: 10
Authors
Li, Yuanqing [1 ]
Amari, Shun-ichi [2 ]
Affiliations
[1] S China Univ Technol, Sch Automat Sci & Engn, Guangzhou 510640, Guangdong, Peoples R China
[2] RIKEN Brain Sci Inst, Lab Math Neurosci, Wako, Saitama 3510198, Japan
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2010 / Vol. 21 / No. 7
Funding
National Natural Science Foundation of China
Keywords
0-norm solution; 1-norm solution; equivalence; probability; sparse representation; BLIND SOURCE SEPARATION; ATOMIC DECOMPOSITION; BASES; PROBABILITY;
DOI
10.1109/TNN.2010.2049370
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405
Abstract
In sparse representation, two important sparse solutions, the 0-norm solution and the 1-norm solution, have received much attention. The 0-norm solution is the sparsest, but it is not easy to obtain. Although the 1-norm solution may not be the sparsest, it can be obtained easily by linear programming. In many cases, the 0-norm solution can be obtained by finding the 1-norm solution, and the equivalence of the two sparse solutions has been widely discussed. This paper analyzes two conditions for the equivalence of the two sparse solutions. The first condition is necessary and sufficient but difficult to verify. The second condition is necessary but not sufficient; however, it is easy to verify. In this paper, we analyze the second condition within a stochastic framework and propose a variant of it. We then prove that the equivalence of the two sparse solutions holds with high probability under this variant. Furthermore, in the limiting case where the 0-norm solution is extremely sparse, the second condition is also sufficient with probability 1.
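The abstract notes that the 1-norm solution can be obtained by linear programming. A minimal sketch of this idea (basis pursuit) is shown below, using the standard split x = u - v with u, v >= 0 so that min ||x||_1 subject to Ax = b becomes a linear program. The dictionary A, the sparse ground truth x0, and the observation b here are illustrative assumptions, not data from the paper.

```python
# Illustrative sketch: computing the 1-norm solution of Ax = b
# via linear programming (basis pursuit). A, x0, and b are made up
# for demonstration; they are not from the paper.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n = 10, 30                        # underdetermined system: m < n
A = rng.standard_normal((m, n))
x0 = np.zeros(n)
x0[[2, 7, 11]] = [1.5, -2.0, 0.8]    # an assumed 3-sparse ground truth
b = A @ x0

# min ||x||_1 s.t. Ax = b, rewritten with x = u - v, u >= 0, v >= 0:
# minimize sum(u) + sum(v) subject to A u - A v = b.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
x = res.x[:n] - res.x[n:]            # recover x from the split variables

print("1-norm of LP solution:", np.linalg.norm(x, 1))
print("1-norm of ground truth:", np.linalg.norm(x0, 1))
```

By construction the LP solution satisfies the constraint Ax = b, and its 1-norm can be no larger than that of any other feasible vector, including x0; whether it also coincides with the sparsest (0-norm) solution is exactly the equivalence question the paper studies.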
Pages: 1189-1196
Page Count: 9