Generalized Approximate Message-Passing for Compressed Sensing With Sublinear Sparsity

Cited: 0
Authors
Takeuchi, Keigo [1 ]
Affiliation
[1] Toyohashi Univ Technol, Dept Elect & Elect Informat Engn, Toyohashi 4418580, Japan
Funding
Japan Society for the Promotion of Science;
Keywords
Compressed sensing; exact signal reconstruction; sublinear sparsity; generalized approximate message-passing; state evolution; PHASE RETRIEVAL; SIGNAL RECOVERY; THRESHOLDING ALGORITHM; MULTIUSER DETECTION; STATE EVOLUTION; INFORMATION; REGRESSION; SHRINKAGE; DYNAMICS; LIMITS;
DOI
10.1109/TIT.2025.3560070
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812 ;
Abstract
This paper addresses the reconstruction of an unknown signal vector with sublinear sparsity from generalized linear measurements. Generalized approximate message-passing (GAMP) is proposed via state evolution in the sublinear sparsity limit, where the signal dimension N, measurement dimension M, and signal sparsity k satisfy log k / log N → γ ∈ [0, 1) and M / {k log(N/k)} → δ as N and k tend to infinity. While the overall flow of state evolution is the same as that for linear sparsity, each proof step for inner denoising requires stronger assumptions than in the linear-sparsity case. The required new assumptions are proved for Bayesian inner denoising. When Bayesian outer and inner denoisers are used in GAMP, the resulting state evolution recursion is utilized to evaluate the prefactor δ in the sample complexity, called the reconstruction threshold. Bayesian GAMP achieves asymptotically exact signal reconstruction if and only if δ is larger than the reconstruction threshold. In particular, the reconstruction threshold is finite for noisy linear measurements when the support of the non-zero signal elements excludes a neighborhood of zero. As numerical examples, this paper considers linear measurements and 1-bit compressed sensing. Numerical simulations for both cases show that Bayesian GAMP outperforms existing algorithms for sublinear sparsity in terms of sample complexity.
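The scaling regime in the abstract fixes the sparsity as log k / log N → γ and the measurement budget as M ≈ δ · k · log(N/k). The sketch below is only an illustration of that scaling, not the paper's algorithm: it assumes the idealized choices k = N^γ (so log k / log N = γ exactly) and M = ⌈δ k log(N/k)⌉, and shows how few measurements the regime requires relative to N.

```python
import numpy as np


def sample_complexity(N, gamma, delta):
    """Illustrative sample complexity in the sublinear-sparsity regime.

    Assumes k = N**gamma, so that log k / log N = gamma exactly, and
    M = ceil(delta * k * log(N / k)), following the scaling in the abstract.
    """
    k = int(round(N ** gamma))          # sublinear sparsity: k grows like N^gamma
    M = int(np.ceil(delta * k * np.log(N / k)))  # measurement dimension
    return k, M


# Example: N = 10**6 signal entries, gamma = 0.5, prefactor delta = 2.0
k, M = sample_complexity(10**6, 0.5, 2.0)
# k = 1000 non-zeros; M = ceil(2 * 1000 * ln(1000)) = 13816 measurements,
# i.e. M/N ≈ 1.4% — far below the ambient dimension N.
```

Here δ plays the role of the prefactor whose critical value (the reconstruction threshold) the paper characterizes: exact reconstruction is possible asymptotically if and only if δ exceeds that threshold.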
Pages: 4602-4636 (35 pages)