Distributed Jointly Sparse Bayesian Learning With Quantized Communication

Cited by: 9
Authors
Hua, Junhao [1 ,2 ]
Li, Chunguang [1 ,2 ]
Affiliations
[1] Zhejiang Univ, Coll Informat Sci & Elect Engn, Hangzhou 310027, Zhejiang, Peoples R China
[2] Zhejiang Univ, Zhejiang Prov Key Lab Informat Proc Commun & Netw, Hangzhou 310027, Zhejiang, Peoples R China
Source
IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS | 2018, Vol. 4, No. 4
Funding
National Natural Science Foundation of China
Keywords
Distributed estimation; joint sparsity; sparse Bayesian learning; quantization; variational Bayes; quantized consensus; sensor networks; SENSOR NETWORKS; ACTUATOR NETWORKS; WIRELESS SENSOR; CONSENSUS; OPTIMIZATION; ALGORITHMS; APPROXIMATION; CONSTRAINTS; RECOVERY; VECTORS
DOI
10.1109/TSIPN.2018.2832026
CLC classification
TM [electrical engineering]; TN [electronic and communication technology]
Discipline codes
0808; 0809
Abstract
We consider the problem of distributed recovery of jointly sparse signals by a number of nodes in a sensor network, using multiple noisy linear measurements. Distributed Bayesian algorithms have been proposed in the literature to tackle this problem, most of which assume that nodes can transmit real-valued data with infinite precision. In many practical applications, however, sensor networks have limited communication bandwidth and finite-capacity channels, so digital quantization of the transmitted data is inevitable. In this paper, we consider the case in which the quantities/messages transmitted between nodes (rather than the measurement data) are quantized to discrete values with finite precision. We formulate a fully hierarchical jointly sparse Bayesian model and propose a novel distributed variational Bayesian (VB) algorithm that uses only the quantized transmitted messages. Within the proposed VB algorithm, an inexact alternating direction method of multipliers is developed to achieve quantized consensus. We theoretically analyze the convergence of the proposed algorithm and study the effect of digital quantization. Through numerical simulations, we find the counterintuitive result that the proposed quantized algorithm can even outperform the corresponding unquantized algorithm and its centralized counterpart.
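The quantized-consensus idea the abstract refers to can be illustrated with a minimal sketch. This is not the paper's inexact-ADMM scheme; it is a plain consensus-averaging loop in which nodes exchange only dither-quantized states, so every transmitted message has finite precision. The function names, quantization step, and mixing rate below are illustrative assumptions, not the authors' notation.

```python
import numpy as np

def dithered_quantize(x, step, rng):
    # Uniform quantizer with additive dither: adding uniform noise in
    # [-step/2, step/2) before rounding makes the quantization error
    # (approximately) zero-mean, which helps consensus stay unbiased.
    return step * np.round(x / step + rng.uniform(-0.5, 0.5, size=np.shape(x)))

def quantized_consensus(values, adjacency, step=0.05, iters=300, eps=0.3, seed=0):
    # Each node repeatedly moves toward the average of its neighbors'
    # *quantized* states, so only finite-precision messages cross the links.
    rng = np.random.default_rng(seed)
    x = np.asarray(values, dtype=float).copy()
    n = len(x)
    for _ in range(iters):
        q = dithered_quantize(x, step, rng)  # messages actually sent
        for i in range(n):
            neighbors = np.flatnonzero(adjacency[i])
            if len(neighbors):
                x[i] += eps * np.mean(q[neighbors] - q[i])
    return x
```

On a connected graph all states contract toward the network average, up to a residual fluctuation on the order of the quantization step; shrinking `step` trades communication cost against that residual, which is the bandwidth/accuracy trade-off the paper analyzes in its own setting.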
Pages: 769-782 (14 pages)