On Approximate Message Passing Algorithms for Unlimited Sampling of Sparse Signals

Cited by: 2
Authors
Musa, Osman [1 ,2 ]
Jung, Peter [1 ]
Caire, Giuseppe [1 ]
Affiliations
[1] Tech Univ Berlin, Commun & Informat Theory, Berlin, Germany
[2] Tech Univ Berlin, BIFOLD, Berlin, Germany
Source
2023 IEEE 9TH INTERNATIONAL WORKSHOP ON COMPUTATIONAL ADVANCES IN MULTI-SENSOR ADAPTIVE PROCESSING, CAMSAP | 2023
Keywords
Approximate message passing; Gaussian mixture; self-reset analog-to-digital converter; compressed sensing
DOI
10.1109/CAMSAP58249.2023.10403491
Chinese Library Classification
TP39 [Computer Applications]
Subject Classification Codes
081203; 0835
Abstract
In this paper, we investigate different approximate message passing (AMP) algorithms for recovering sparse signals measured in a compressed unlimited sampling (US) framework. More specifically, in addition to our previous work on the generalized approximate message passing (GAMP) algorithm, we use an alternative formulation of the US recovery problem to consider the Bayesian approximate message passing (BAMP) algorithm. Furthermore, we consider learned versions of the two algorithms based on modelling the source prior with a Gaussian mixture (GM), which can closely approximate continuous, discrete, and mixture distributions. We thus propose the learned Gaussian mixture GAMP (L-GM-GAMP) and the learned Gaussian mixture AMP (L-GM-AMP) algorithms for the US recovery problem - two plug-and-play algorithms that learn the source distribution and the algorithms' tunable parameters in a supervised manner. To empirically demonstrate the effectiveness of these algorithms, we conduct Monte Carlo (MC) simulations. The results show that the computationally more stable learned AMP (LAMP) requires slightly more measurements to reach the same accuracy as the GAMP algorithm. Additionally, we observe that within the US framework, the algorithms using the learning approach, namely L-GM-AMP and L-GM-GAMP, achieve the same accuracy while reducing the amount of required prior knowledge, at the expense of prior algorithm training.
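To make the setting concrete, the following minimal NumPy sketch illustrates the self-reset ADC (modulo) measurement model used in unlimited sampling together with a basic AMP iteration driven by a Gaussian-mixture MMSE denoiser. It is a sketch under assumed conventions, not the authors' L-GM-AMP or L-GM-GAMP implementation: the function names (modulo, gm_denoiser, amp_recover), the fixed GM parameters, and the residual-based noise estimate are illustrative assumptions, and handling of the modulo folding inside the recovery loop (e.g., via the GAMP output channel or the paper's alternative formulation) is deliberately omitted.

import numpy as np

def modulo(z, lam):
    # Self-reset ADC nonlinearity: folds z into the interval [-lam, lam).
    return np.mod(z + lam, 2.0 * lam) - lam

def gm_denoiser(r, tau, weights, means, variances):
    # MMSE estimate of x from r = x + N(0, tau) under a Gaussian-mixture prior.
    # Returns the posterior mean and the average denoiser derivative needed by AMP.
    post_var = variances * tau / (variances + tau)                           # (K,)
    post_mean = (r[:, None] * variances + tau * means) / (variances + tau)   # (N, K)
    log_lik = (-0.5 * (r[:, None] - means) ** 2 / (variances + tau)
               - 0.5 * np.log(2.0 * np.pi * (variances + tau)))
    log_w = np.log(weights) + log_lik
    log_w -= log_w.max(axis=1, keepdims=True)
    resp = np.exp(log_w)
    resp /= resp.sum(axis=1, keepdims=True)                                  # responsibilities
    xhat = (resp * post_mean).sum(axis=1)
    second_moment = (resp * (post_var + post_mean ** 2)).sum(axis=1)
    dxhat = np.mean(second_moment - xhat ** 2) / tau                         # <eta'(r)>
    return xhat, dxhat

def amp_recover(y, A, weights, means, variances, n_iter=30):
    # Basic AMP loop (sketch); assumes unfolded measurements y = A x + w.
    M, N = A.shape
    xhat = np.zeros(N)
    z = y.copy()
    for _ in range(n_iter):
        tau = np.sum(z ** 2) / M                   # effective noise variance estimate
        r = xhat + A.T @ z                         # pseudo-measurement of x
        xhat, dxhat = gm_denoiser(r, tau, weights, means, variances)
        z = y - A @ xhat + (N / M) * dxhat * z     # residual with Onsager correction
    return xhat

# Example with assumed parameters: a sparse x drawn from a spike-and-slab GM prior,
# an i.i.d. Gaussian A, and folded measurements produced by the self-reset ADC.
rng = np.random.default_rng(0)
N, M, lam = 256, 128, 1.0
weights = np.array([0.9, 0.1])          # spike (approximately zero) and slab components
means = np.array([0.0, 0.0])
variances = np.array([1e-4, 1.0])
comp = rng.choice(2, size=N, p=weights)
x = rng.normal(means[comp], np.sqrt(variances[comp]))
A = rng.normal(0.0, 1.0 / np.sqrt(M), size=(M, N))
y_folded = modulo(A @ x + 0.01 * rng.normal(size=M), lam)    # US (folded) measurements
xhat = amp_recover(A @ x, A, weights, means, variances)      # AMP run on unfolded data only

The two-component spike-and-slab mixture above is one common way to encode sparsity with a GM prior; in the learned variants described in the abstract, the GM parameters and the algorithms' tunable parameters would instead be fitted from training data rather than fixed by hand.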
Pages: 131-135 (5 pages)