Sparse Signal Processing With Linear and Nonlinear Observations: A Unified Shannon-Theoretic Approach

Cited by: 20
Authors
Aksoylar, Cem [1 ]
Atia, George K. [2 ]
Saligrama, Venkatesh [1 ]
Affiliations
[1] Boston Univ, Dept Elect & Comp Engn, Boston, MA 02215 USA
[2] Univ Cent Florida, Dept Elect & Comp Engn, Orlando, FL 32816 USA
Funding
US National Science Foundation;
Keywords
Support recovery; sample complexity; compressed sensing; group testing; mutual information; sparsity pattern recovery; information-theoretic limits; nonlinear models; 1-bit compressive sensing; SUFFICIENT CONDITIONS; SUPPORT RECOVERY; LIMITS; BOUNDS;
DOI
10.1109/TIT.2016.2605122
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
We derive fundamental sample complexity bounds for recovering sparse and structured signals under linear and nonlinear observation models, including sparse regression, group testing, multivariate regression, and problems with missing features. In general, sparse signal processing problems can be characterized in terms of the following Markovian property. We are given a set of N variables X_1, X_2, ..., X_N, and there is an unknown subset of variables S ⊂ {1, ..., N} that are relevant for predicting outcomes Y. More specifically, when Y is conditioned on {X_n}_{n∈S}, it is conditionally independent of the other variables, {X_n}_{n∉S}. Our goal is to identify the set S from samples of the variables X and the associated outcomes Y. We characterize this problem as a version of the noisy channel coding problem. Using asymptotic information-theoretic analyses, we establish mutual information formulas that provide necessary and sufficient conditions on the number of samples required to successfully recover the salient variables. These mutual information expressions unify conditions for both linear and nonlinear observations. We then compute sample complexity bounds for the aforementioned models, based on the mutual information expressions, to demonstrate the applicability and flexibility of our results to general sparse signal processing models.
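As a concrete illustration of the setup described in the abstract, the following minimal sketch (not taken from the paper; the function names and the parameter values N, K, T, and the noise level are illustrative assumptions) generates data from a linear (sparse regression) model and a nonlinear (1-bit) model in which Y depends on X only through an unknown support S, and then recovers S by exhaustive search over size-K subsets. The paper derives information-theoretic sample complexity bounds rather than algorithms; the brute-force decoder here only demonstrates the recovery task itself.

```python
# Minimal sketch of the support recovery setup: Y depends on X only through
# an unknown sparse support S (the Markovian property in the abstract).
# All parameter values are illustrative assumptions, not from the paper.
import itertools
import numpy as np

rng = np.random.default_rng(0)

N, K, T = 12, 2, 200                              # ambient dimension, sparsity, samples
S_true = sorted(rng.choice(N, size=K, replace=False))  # unknown salient set S

X = rng.standard_normal((T, N))                   # i.i.d. measurement variables
beta = rng.standard_normal(K)                     # coefficients on the support

# Linear observation model (sparse regression): Y is a noisy linear function of X_S.
y_lin = X[:, S_true] @ beta + 0.1 * rng.standard_normal(T)

# Nonlinear observation model (1-bit compressive sensing): only the sign is observed.
y_1bit = np.sign(X[:, S_true] @ beta + 0.1 * rng.standard_normal(T))

def support_score(X, y, S):
    """Goodness of fit of a candidate support S via least squares on X_S."""
    coef, *_ = np.linalg.lstsq(X[:, S], y, rcond=None)
    resid = y - X[:, S] @ coef
    return -np.sum(resid ** 2)

def brute_force_recovery(X, y, K):
    """Exhaustive search over all size-K supports (feasible only for tiny N)."""
    return max(itertools.combinations(range(X.shape[1]), K),
               key=lambda S: support_score(X, y, list(S)))

print("true S:            ", tuple(S_true))
print("recovered (linear):", brute_force_recovery(X, y_lin, K))
print("recovered (1-bit): ", brute_force_recovery(X, y_1bit, K))
```

With enough samples T (as quantified by the paper's mutual information bounds), both the linear and the 1-bit runs above recover the true support; shrinking T toward the information-theoretic limit makes recovery increasingly unreliable.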
Pages: 749 - 776
Page count: 28