On Fundamental Limits of Joint Sparse Support Recovery Using Certain Correlation Priors

Cited by: 26
Authors
Koochakzadeh, Ali [1]
Qiao, Heng [2]
Pal, Piya [3]
Affiliations
[1] Univ Calif San Diego, Signal Proc, La Jolla, CA 92093 USA
[2] Univ Calif San Diego, Dept Elect & Comp Engn, La Jolla, CA 92093 USA
[3] Univ Calif San Diego, Elect & Comp Engn, La Jolla, CA 92093 USA
Funding
National Science Foundation (USA)
Keywords
Support recovery; Khatri-Rao product; underdetermined estimation; sparse samplers; Chernoff bound; signal reconstruction; sufficient conditions; theoretic limits; representations; performance; algorithms; arrays; rank
DOI
10.1109/TSP.2018.2858211
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic and Communication Technology]
Discipline Classification Codes
0808; 0809
Abstract
This paper provides new probabilistic guarantees for recovering the common support of jointly sparse vectors in multiple measurement vector (MMV) models. Recently, Bayesian approaches to sparse signal recovery (such as sparse Bayesian learning and correlation-aware LASSO) have shown preliminary evidence that, under appropriate conditions (such as access to the ideal covariance matrix of the measurements or certain restrictive orthogonality conditions on the signals), it is possible to recover supports of size K larger than the dimension M of each measurement vector. However, no results exist that characterize the probability with which this can be achieved for a finite number of measurement vectors L. This paper bridges this gap by formulating the support recovery problem in a multiple hypothesis testing framework. Chernoff-type upper bounds on the probability of error are established, and new sufficient conditions are derived that guarantee its exponential decay with respect to L even when K = O(M^2). Our sufficient conditions are based on the properties of the so-called Khatri-Rao product of the measurement matrix with itself, and they reveal the importance of sampler design. Negative results are also established, indicating that when K exceeds a certain threshold (in terms of M), there exists a class of measurement matrices for which any support recovery algorithm will fail. Using results from geometric probability, we characterize the probability with which a randomly generated measurement matrix will belong to this class, and show that this probability tends to 1 asymptotically in the size N of the sparse vectors.
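For intuition, here is a minimal sketch (assumptions: the far-field steering model, the specific ULA and nested-array geometries, and all dimensions are illustrative choices, not taken from the paper) of why the Khatri-Rao product controls how large a recoverable support can be: vectorizing the measurement covariance in the MMV model turns an M-dimensional problem into one governed by a matrix with M^2 rows, and a well-designed sampler keeps many of those rows distinct.

```python
# Sketch (assumed setup, not the paper's experiments): in the MMV model
# Y = A X + N with uncorrelated sources of powers p, the vectorized
# measurement covariance satisfies
#     vec(R_y) = khatri_rao(conj(A), A) @ p + sigma^2 * vec(I),
# so support identifiability is governed by the Khatri-Rao (KR) product
# of A with its conjugate, which has M^2 rows instead of M.
import numpy as np
from scipy.linalg import khatri_rao  # column-wise Kronecker product


def steering_matrix(positions, angles_deg):
    """Far-field steering matrix for sensors at integer positions
    (half-wavelength units) and source directions in degrees."""
    pos = np.asarray(positions, dtype=float)[:, None]        # (M, 1)
    u = np.sin(np.deg2rad(np.asarray(angles_deg)))[None, :]  # (1, N)
    return np.exp(1j * np.pi * pos * u)                      # (M, N)


M = 6                                  # sensors per measurement vector
angles = np.linspace(-60, 60, 40)      # N = 40 candidate directions

samplers = {
    "ULA": np.arange(M),               # positions 0, 1, ..., 5
    # Two-level nested array: its difference coarray covers every
    # integer lag from -11 to 11.
    "nested": np.array([0, 1, 2, 3, 7, 11]),
}

for name, pos in samplers.items():
    A = steering_matrix(pos, angles)
    kr = khatri_rao(A.conj(), A)       # (M^2, N) "virtual" matrix
    rank = np.linalg.matrix_rank(kr, tol=1e-8)
    print(f"{name}: M = {M}, rank of conj(A) KR A = {rank}")

# Expected output: the ULA saturates at rank 2M - 1 = 11, while the
# nested geometry reaches 23, i.e., O(M^2) -- consistent with the
# claim that sampler design decides whether supports of size K > M
# remain identifiable from correlation information.
```

The design point mirrored here is that geometries whose difference set contains many distinct lags (nested or coprime samplers, for instance) make the Khatri-Rao matrix effectively much taller than M, which is one way to read the abstract's emphasis on sampler design.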
Pages: 4612-4625
Number of pages: 14
Related Papers
54 in total (first 10 listed)
  • [1] Aeron, S., Saligrama, V., Zhao, M. Information Theoretic Bounds for Compressed Sensing. IEEE Transactions on Information Theory, 2010, 56(10): 5111-5130.
  • [2] Akcakaya, M., Tarokh, V. Shannon-Theoretic Limits on Noisy Compressive Sampling. IEEE Transactions on Information Theory, 2010, 56(1): 492-504.
  • [3] Aksoylar, C., Atia, G. K., Saligrama, V. Sparse Signal Processing With Linear and Nonlinear Observations: A Unified Shannon-Theoretic Approach. IEEE Transactions on Information Theory, 2017, 63(2): 749-776.
  • [4] [Anonymous]. Advances in Neural Information Processing Systems, 2011.
  • [5] [Anonymous]. Advances in Neural Information Processing Systems, 2008.
  • [6] Balkan, O., Kreutz-Delgado, K., Makeig, S. Localization of More Sources Than Sensors via Jointly-Sparse Bayesian Learning. IEEE Signal Processing Letters, 2014, 21(2): 131-134.
  • [7] Blanchard, J. D., Cermak, M., Hanle, D., Jing, Y. Greedy Algorithms for Joint Sparse Recovery. IEEE Transactions on Signal Processing, 2014, 62(7): 1694-1704.
  • [8] Candès, E. J. Compressive Sampling. Proceedings of the International Congress of Mathematicians, 2006, Vol. 3, p. 1433. DOI: 10.4171/022-3/69.
  • [9] Candès, E. J., Romberg, J., Tao, T. Robust Uncertainty Principles: Exact Signal Reconstruction From Highly Incomplete Frequency Information. IEEE Transactions on Information Theory, 2006, 52(2): 489-509.
  • [10] Candès, E. J., Tao, T. Decoding by Linear Programming. IEEE Transactions on Information Theory, 2005, 51(12): 4203-4215.