Generalized Memory Approximate Message Passing for Generalized Linear Model

Cited by: 3
Authors
Tian, Feiyan [1 ,2 ,3 ]
Liu, Lei [4 ]
Chen, Xiaoming [1 ,2 ,3 ]
Affiliations
[1] Zhejiang Univ, Coll Informat Sci & Elect Engn, Hangzhou 310027, Peoples R China
[2] Zhejiang Prov Key Lab Informat Proc, Commun & Networking, Hangzhou 310007, Peoples R China
[3] Zhejiang Univ, Int Joint Innovat Ctr, Hangzhou 314400, Peoples R China
[4] Japan Adv Inst Sci & Technol, Sch Informat Sci, Kanazawa, Ishikawa 9208580, Japan
Funding
Japan Society for the Promotion of Science;
Keywords
Maximum likelihood estimation; Signal processing algorithms; Matched filters; Iterative methods; Rail to rail inputs; Message passing; Mathematical models; Approximate message passing (AMP); generalized vector AMP; generalized memory AMP; right rotationally invariant; Bayes optimality; low complexity; RECOVERY; DYNAMICS;
DOI
10.1109/TSP.2022.3213414
Chinese Library Classification
TM [electrical engineering]; TN [electronics and communication technology];
Discipline codes
0808; 0809;
Abstract
For signal reconstruction in a generalized linear model (GLM), generalized approximate message passing (GAMP) is a low-complexity algorithm with many appealing features, such as an exact performance characterization in the high-dimensional limit. However, it is viable only when the transformation matrix has independent and identically distributed (IID) entries. Generalized vector AMP (GVAMP) has wider applicability but high computational complexity. To overcome the shortcomings of GAMP and GVAMP, we propose a low-complexity and widely applicable generalized memory AMP (GMAMP) framework, including an orthogonal memory linear estimator (MLE) and two orthogonal memory nonlinear estimators (MNLEs), which guarantee the asymptotic IID Gaussianity of the estimation errors and the state evolution (SE) of GMAMP. The proposed GMAMP is universal, since the existing AMP, convolutional AMP, orthogonal/vector AMP, GVAMP, and memory AMP (MAMP) are all its special instances. More importantly, we provide a principle for building new advanced AMP-type algorithms on the proposed GMAMP framework. As an example, we construct a Bayes-optimal GMAMP (BO-GMAMP) algorithm, which adopts a memory matched filter estimator to suppress the linear interference, so that its complexity is comparable to GAMP. Furthermore, we prove that the SE of BO-GMAMP with optimized parameters converges to the same fixed point as that of the high-complexity GVAMP. In other words, BO-GMAMP achieves the replica-minimum (i.e., potentially Bayes-optimal) mean square error (MSE) if its SE has a unique fixed point. Finally, simulation results are provided to validate the accuracy of the theoretical analysis.
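For orientation, the AMP family the abstract builds on can be illustrated with the classical AMP recursion for a (non-generalized) linear model with an IID Gaussian matrix, i.e., the setting in which plain AMP is valid. The sketch below is a standard textbook AMP with a soft-thresholding denoiser and an Onsager correction term; it is not the paper's BO-GMAMP, which replaces this recursion with memory linear/nonlinear estimators that remain valid for right rotationally invariant matrices. The threshold parameter `theta` is an illustrative tuning choice, not from the paper.

```python
import numpy as np

def soft_threshold(x, t):
    # Componentwise soft-thresholding denoiser (a simple sparsity prior).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def amp(A, y, n_iter=50, theta=1.5):
    """Classical AMP for y = A x + noise with IID Gaussian A (entries ~ N(0, 1/m)).

    Illustrative only: GAMP extends this to generalized (nonlinear) output
    channels, and the paper's GMAMP further replaces the matched-filter step
    with memory estimators valid for right rotationally invariant A.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(n_iter):
        tau = np.linalg.norm(z) / np.sqrt(m)          # effective noise level
        r = x + A.T @ z                               # matched-filter (pseudo-data) step
        x_new = soft_threshold(r, theta * tau)        # denoising step
        # Onsager correction: residual times the empirical divergence of the
        # denoiser (fraction of surviving coordinates, scaled by n/m via 1/m).
        onsager = (z / m) * np.count_nonzero(x_new)
        z = y - A @ x_new + onsager
        x = x_new
    return x
```

The Onsager term is what makes the per-iteration errors asymptotically IID Gaussian and the recursion trackable by state evolution; the orthogonality conditions on the MLE/MNLEs in the GMAMP framework play the analogous role for memory iterations.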
Pages: 6404-6418
Page count: 15