Tight Performance Bounds for Compressed Sensing With Conventional and Group Sparsity

Cited: 13
Authors
Ranjan, Shashank [1 ]
Vidyasagar, Mathukumalli [1 ]
Affiliations
[1] Indian Inst Technol Hyderabad, Dept Elect Engn, Kandi 502285, India
Funding
U.S. National Science Foundation;
Keywords
Compressed sensing; convex functions; optimization; RESTRICTED ISOMETRY PROPERTY; RECOVERY; SIGNALS; UNION;
DOI
10.1109/TSP.2019.2907228
CLC Number
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Subject Classification Code
0808; 0809;
Abstract
In this paper, we study the problem of recovering a group sparse vector from a small number of linear measurements. In the past, the common approach has been to use various "group sparsity-inducing" norms such as the Group LASSO norm for this purpose. By using the theory of convex relaxations, we show that it is also possible to use ℓ1-norm minimization for group sparse recovery. We introduce a new concept called the group robust null space property (GRNSP), and show that, under suitable conditions, a group version of the restricted isometry property (GRIP) implies the GRNSP, and thus leads to group sparse recovery. When all groups are of equal size, our bounds are sometimes less conservative than known bounds. Moreover, our results apply even to situations where the groups have different sizes. When specialized to conventional sparsity, our bounds reduce to one of the well-known "best possible" conditions for sparse recovery. This relationship between GRNSP and GRIP is new even for conventional sparsity, and substantially streamlines the proofs of some known results. Using this relationship, we derive bounds on the ℓp-norm of the residual error vector for all p ∈ [1, 2], and not just for p = 2. When the measurement matrix consists of random samples of a sub-Gaussian random variable, we present bounds on the number of measurements, which are sometimes less conservative than currently known bounds.
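The abstract's central claim, that plain ℓ1-norm minimization (basis pursuit) can recover a group sparse vector from few linear measurements, can be illustrated numerically. The following is a minimal sketch, not taken from the paper: it assumes an i.i.d. Gaussian (hence sub-Gaussian) measurement matrix, arbitrary problem sizes chosen only for illustration, and the cvxpy solver.

```python
# Minimal sketch (illustrative, not from the paper): group sparse recovery via
# plain l1-norm minimization, i.e. basis pursuit, from Gaussian measurements.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m = 200, 80                                               # ambient dimension, measurements
groups = [range(10 * g, 10 * (g + 1)) for g in range(20)]    # 20 non-overlapping groups of size 10

# Build a group sparse vector supported on 2 of the 20 groups.
x_true = np.zeros(n)
for g in rng.choice(len(groups), size=2, replace=False):
    idx = list(groups[g])
    x_true[idx] = rng.standard_normal(len(idx))

# Sub-Gaussian (here: i.i.d. Gaussian) measurement matrix and noiseless measurements y = A x.
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

# Basis pursuit: minimize ||x||_1 subject to A x = y.
x = cp.Variable(n)
problem = cp.Problem(cp.Minimize(cp.norm1(x)), [A @ x == y])
problem.solve()

print("relative recovery error:", np.linalg.norm(x.value - x_true) / np.linalg.norm(x_true))
```

With these (assumed) dimensions the relative error is typically negligible, which is consistent with the paper's point that ℓ1 minimization, rather than a group-sparsity-inducing norm, can suffice when a group RIP / GRNSP condition holds.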
Pages: 2854-2867
Number of pages: 14