Accelerating Neural BP-Based Decoder Using Coded Distributed Computing

Cited by: 1
Authors
Han, Xuesong [1 ]
Liu, Rui [1 ]
Li, Yong [1 ]
Yi, Chen [2 ]
He, Jiguang [3 ]
Wang, Ming [4 ]
Affiliations
[1] Chongqing Univ, Coll Comp Sci, Chongqing 400044, Peoples R China
[2] Chongqing Univ Posts & Telecommun, Sch Commun & Informat Engn, Chongqing 400065, Peoples R China
[3] Technol Innovat Inst, Abu Dhabi 00000, U Arab Emirates
[4] North Carolina State Univ, Dept Comp Sci, Raleigh, NC 27606 USA
Keywords
BP-based decoder; coding theory; distributed computing; neural network; straggler resilience;
DOI
10.1109/TVT.2024.3391836
Chinese Library Classification (CLC)
TM [Electrical engineering]; TN [Electronics and communication technology];
Discipline codes
0808; 0809;
Abstract
While neural BP-based (NBP) decoders exhibit superior error-correction performance compared to belief-propagation (BP) decoders, their high computational and memory requirements impede practical deployment in communication systems. To overcome this challenge, we propose a Coded Neural BP (CNBP) scheme that accelerates the NBP decoder in distributed environments while respecting storage constraints and providing resilience to stragglers. The key idea is to reformulate the primary operations of the NBP decoder as matrix-vector multiplications by introducing weight matrices and transformations. On this basis, the NBP decoder is accelerated by speeding up the matrix-vector multiplications using coded distributed computing. Extensive experiments on an Amazon EC2 cluster demonstrate that CNBP achieves notable acceleration and scalability without any loss in error-correction performance.
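The abstract's key idea, protecting matrix-vector products against stragglers with coded distributed computing, can be illustrated with a generic MDS-style coded scheme. The sketch below is an assumption for illustration (Vandermonde encoding over k row blocks, recovery from any k of n worker results), not the paper's exact CNBP construction:

```python
import numpy as np

def mds_encode(A, n, k):
    """Split A into k row blocks and produce n coded blocks via a
    Vandermonde generator, so any k worker results suffice to decode."""
    blocks = np.split(A, k)  # assumes A's row count is divisible by k
    evals = np.arange(1, n + 1)
    G = np.vander(evals, k, increasing=True).astype(float)  # n x k generator
    coded = [sum(G[i, j] * blocks[j] for j in range(k)) for i in range(n)]
    return coded, G

def decode(results, ids, G, k):
    """Recover A @ x from the coded products returned by any k workers."""
    Gs = G[ids, :]              # k x k submatrix of the generator
    Y = np.stack(results)       # each row is (coded block) @ x
    X = np.linalg.solve(Gs, Y)  # row j recovers blocks[j] @ x
    return np.concatenate(X)
```

With n = 5 workers and k = 3, the master tolerates any n - k = 2 stragglers: whichever three workers respond first, solving the 3x3 Vandermonde system reconstructs the full product.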
Pages: 13997-14002
Page count: 6