Distributed Memory Approximate Message Passing

Cited: 0
Authors
Lu, Jun [1 ]
Liu, Lei [1 ]
Huang, Shunqi [2 ]
Wei, Ning [3 ,4 ]
Chen, Xiaoming [1 ]
Affiliations
[1] Zhejiang Univ, Coll Informat Sci & Elect Engn, Zhejiang Prov Key Lab Informat Proc Commun & Netwo, Hangzhou 310007, Peoples R China
[2] Japan Adv Inst Sci & Technol, Sch Informat Sci, Nomi 9231292, Japan
[3] ZTE Corp, Shenzhen 518055, Peoples R China
[4] State Key Lab Mobile Network & Mobile Multimedia T, Shenzhen 518055, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Vectors; Transforms; Maximum likelihood estimation; Costs; Bayes methods; Message passing; Matrix converters; Consensus propagation; distributed information processing; memory approximate message passing; DYNAMICS;
DOI
10.1109/LSP.2024.3460478
CLC Classification Number
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Classification Code
0808 ; 0809 ;
Abstract
Approximate message passing (AMP) algorithms are iterative methods for signal recovery in noisy linear systems. In some scenarios, AMP algorithms must operate over a distributed network. To address this challenge, distributed extensions of AMP (D-AMP, FD-AMP) and of orthogonal/vector AMP (D-OAMP/D-VAMP) were proposed, but they inherit the limitations of their centralized counterparts. In this letter, we propose distributed memory AMP (D-MAMP) to overcome the independent and identically distributed (IID) matrix restriction of D-AMP/FD-AMP, as well as the high complexity and heavy communication cost of D-OAMP/D-VAMP. We introduce a matrix-by-vector variant of MAMP tailored to distributed computing. Leveraging this variant, D-MAMP enables each node to carry out its computations using only the locally available observation vectors and transform matrices, while global summations of the locally updated results are performed through message interaction among nodes. For acyclic graphs, D-MAMP converges to the same mean-square-error performance as centralized MAMP.
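The core enabler described in the abstract is that each node can work on its local observation vector and transform matrix, with only a global summation exchanged between nodes. A minimal sketch of that pattern (not the authors' D-MAMP iteration itself; the partition scheme and variable names here are illustrative assumptions) shows that summing node-local matched-filter terms reproduces the centralized computation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: N-dim signal, M observations split across K nodes.
N, M, K = 8, 12, 3
A = rng.standard_normal((M, N))   # global transform matrix
y = rng.standard_normal(M)        # global observation vector

# Row-wise partition (an assumption for this sketch):
# node k holds only its local block (A_k, y_k).
A_parts = np.array_split(A, K, axis=0)
y_parts = np.array_split(y, K)

# Each node computes a local term A_k^T y_k from its own data ...
local_terms = [Ak.T @ yk for Ak, yk in zip(A_parts, y_parts)]

# ... and a global summation (realized via message interaction in
# D-MAMP) recovers the centralized quantity A^T y exactly.
global_sum = np.sum(local_terms, axis=0)

assert np.allclose(global_sum, A.T @ y)
```

The same decomposition applies to every matrix-by-vector product in the iteration, which is why a matrix-by-vector variant of MAMP distributes with only vector-valued summations as inter-node traffic.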
Pages: 2660-2664
Number of pages: 5