Joint-Way Compression for LDPC Neural Decoding Algorithm With Tensor-Ring Decomposition

Cited by: 5
Authors
Liang, Yuanhui [1]
Lam, Chan-Tong [1]
Ng, Benjamin K. [1]
Affiliations
[1] Macao Polytechnic University, Faculty of Applied Sciences, Macau 999078, China
Keywords
Decoding; Artificial neural networks; Iterative decoding; Tensors; Matrix decomposition; Compression algorithms; Training; Joint-way compression; weight sharing; tensor ring decomposition; LDPC neural decoding;
DOI
10.1109/ACCESS.2023.3252907
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
In this paper, we propose low-complexity joint-way compression algorithms that combine Tensor-Ring (TR) decomposition with weight sharing to further reduce the storage and computational requirements of low-density parity-check (LDPC) neural decoding. Compared with Tensor-Train (TT) decomposition, TR decomposition offers more flexibility in rank selection and is better suited to rank optimization algorithms. In particular, we use TR decomposition to factorize not only the weight matrix of the Neural Normalized Min-Sum (NNMS+) algorithm but also the message matrix passed between variable nodes and check nodes. Furthermore, we combine TR decomposition with weight sharing, an approach we call joint-way compression, to further lower the complexity of the LDPC neural decoding algorithm. We show that joint-way compression achieves better compression efficiency than either technique alone while maintaining comparable bit error rate (BER) performance. Numerical experiments show that all the compression algorithms incur almost no performance degradation when the ranks are chosen appropriately, and that the TRwm-ssNNMS+ algorithm, which combines spatial sharing with TR decomposition of both the weight and message matrices, achieves the best compression efficiency. Compared with our TT-NNMS+ algorithm proposed in Liang et al. (2022), the number of parameters is reduced by about 70 times and the number of multiplications by about 6 times.
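The abstract's core idea is to store the NNMS+ weight matrix (and the message matrix exchanged between variable and check nodes) as a chain of small Tensor-Ring cores rather than as a full dense matrix. As a rough illustration only, the Python/NumPy sketch below is not the paper's code: the 16 x 16 weight matrix, the 4 x 4 x 4 x 4 folding, and the uniform TR rank of 2 are assumed toy values. It reconstructs a matrix from TR cores and compares parameter counts.

```python
# Toy sketch of Tensor-Ring (TR) storage of a weight matrix.
# All sizes, ranks, and names here are illustrative assumptions, not the paper's setup.
import numpy as np

def tr_reconstruct(cores):
    """Contract TR cores G_k of shape (r_k, n_k, r_{k+1}), with r_{d+1} = r_1,
    back into the full d-way tensor."""
    result = cores[0]                                  # shape (r_1, n_1, r_2)
    for core in cores[1:]:
        r1, n_prod, _ = result.shape
        # Merge the next mode over the shared rank index.
        result = np.einsum('apb,bqc->apqc', result, core)
        result = result.reshape(r1, n_prod * core.shape[1], core.shape[2])
    # Close the ring: trace over the boundary rank r_1.
    full = np.einsum('apa->p', result)                 # shape (n_1 * ... * n_d,)
    return full.reshape([c.shape[1] for c in cores])

# Assumed toy setup: a 16 x 16 weight matrix folded into a 4x4x4x4 tensor,
# stored with a uniform TR rank of 2.
dims, rank = [4, 4, 4, 4], 2
rng = np.random.default_rng(0)
cores = [rng.standard_normal((rank, n, rank)) for n in dims]

W = tr_reconstruct(cores).reshape(16, 16)              # dense matrix on demand

full_params = 16 * 16
tr_params = sum(c.size for c in cores)                 # 4 cores of size 2*4*2 = 64 total
print(f"dense: {full_params} parameters, TR cores: {tr_params} parameters")
```

With these assumed sizes the TR cores hold 64 parameters instead of 256; the roughly 70-fold reduction reported in the abstract comes from applying the same idea, together with weight sharing, to the much larger weight and message matrices of the full decoder.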
Pages: 22871-22879
Number of pages: 9
References
18 in total
[1] Buchberger A., Hager C., Pfister H. D., Schmalen L., Graell i Amat A., "Pruning and Quantizing Neural Belief Propagation Decoders," IEEE Journal on Selected Areas in Communications, vol. 39, no. 7, pp. 1957-1966, 2021.
[2] Buchberger A., IEEE International Symposium on Information Theory (ISIT), p. 338, 2020, DOI: 10.1109/ISIT44484.2020.9174097.
[3] Cichocki A., Mandic D. P., Phan A. H., Caiafa C. F., Zhou G., Zhao Q., De Lathauwer L., "Tensor Decompositions for Signal Processing Applications," IEEE Signal Processing Magazine, vol. 32, no. 2, pp. 145-163, 2015.
[4] Dai J., Tan K., Si Z., Niu K., Chen M., Poor H. V., Cui S., "Learning to Decode Protograph LDPC Codes," IEEE Journal on Selected Areas in Communications, vol. 39, no. 7, pp. 1983-1999, 2021.
[5] Deng L., Li G., Han S., Shi L., Xie Y., "Model Compression and Hardware Acceleration for Neural Networks: A Comprehensive Survey," Proceedings of the IEEE, vol. 108, no. 4, pp. 485-532, 2020.
[6] Fang H., Proc. IEEE International Conference on Communications, p. 1, 2020.
[7] Lian M., IEEE International Symposium on Information Theory (ISIT), p. 161, 2019, DOI: 10.1109/ISIT.2019.8849419.
[8] Liang Y., Lam C.-T., Ng B. K., "A Low-Complexity Neural Normalized Min-Sum LDPC Decoding Algorithm Using Tensor-Train Decomposition," IEEE Communications Letters, vol. 26, no. 12, pp. 2914-2918, 2022.
[9] Lu H. Y., Proc. IEEE Conference on Computer Vision and Pattern Recognition (CVPR), p. 806, 2015, DOI: 10.1109/CVPR.2015.7298681.
[10] Lugosch L., Conference Record of the Asilomar Conference on Signals, Systems, and Computers, p. 594, 2018, DOI: 10.1109/ACSSC.2018.8645388.