Decentralized RLS With Data-Adaptive Censoring for Regressions Over Large-Scale Networks

Cited by: 17
Authors
Wang, Zifeng [1 ]
Yu, Zheng [1 ]
Ling, Qing [2 ]
Berberidis, Dimitris [3 ]
Giannakis, Georgios B. [3 ]
Affiliations
[1] Univ Sci & Technol China, Special Class Gifted Young, Hefei 230026, Anhui, Peoples R China
[2] Sun Yat Sen Univ, Sch Data & Comp Sci, Guangzhou 510275, Guangdong, Peoples R China
[3] Univ Minnesota, Dept Elect & Comp Engn, Minneapolis, MN 55455 USA
Funding
US National Science Foundation (NSF);
Keywords
Decentralized estimation; networks; recursive least-squares (RLS); data-adaptive censoring; WIRELESS SENSOR NETWORKS; DISTRIBUTED DETECTION; BIG DATA; SQUARES;
DOI
10.1109/TSP.2018.2795594
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic and Communication Technology];
Discipline Codes
0808; 0809;
Abstract
The deluge of networked data motivates the development of algorithms for computation- and communication-efficient information processing. In this context, three data-adaptive censoring strategies are introduced to considerably reduce the computation and communication overhead of decentralized recursive least-squares solvers. The first relies on alternating minimization and the stochastic Newton iteration to minimize a network-wide cost, and discards observations with small innovations. In the resultant algorithm, each node performs local data-adaptive censoring to reduce computations while exchanging its local estimate with neighbors so as to reach consensus on a network-wide solution. The communication cost is further reduced by the second strategy, which prevents a node from transmitting its local estimate to neighbors when the innovation it induces to incoming data is minimal. In the third strategy, not only transmitting, but also receiving estimates from neighbors is prohibited when data-adaptive censoring is in effect. For all strategies, a simple criterion is provided for selecting the innovation threshold that achieves a prescribed average data reduction. The novel censoring-based (C)D-RLS algorithms are proved convergent to the optimal argument in the mean-root deviation sense. Numerical experiments validate the effectiveness of the proposed algorithms in reducing computation and communication overhead.
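The censoring rule summarized in the abstract (skip the update whenever an observation's innovation is small) can be illustrated with a minimal single-node sketch. Everything below is assumed for illustration only: the measurement model y_t = h_t^T x + noise, the threshold tau, the forgetting factor lam, and the name censored_rls are not taken from the paper, whose (C)D-RLS algorithms additionally exchange estimates over a network of nodes.

import numpy as np

def censored_rls(H, y, tau=0.5, lam=0.98, sigma=1.0):
    # Recursive least-squares with data-adaptive censoring (illustrative sketch).
    # An observation (h_t, y_t) is censored, i.e. no update is computed or
    # transmitted, when its normalized innovation |y_t - h_t^T x| / sigma <= tau.
    p = H.shape[1]
    x = np.zeros(p)            # current estimate
    P = 1e3 * np.eye(p)        # "inverse covariance" matrix, large initialization
    for h, yt in zip(H, y):
        innov = yt - h @ x     # innovation (prediction error)
        if abs(innov) / sigma <= tau:
            continue           # censor: small innovation, skip this update
        P = P / lam                        # exponential forgetting
        g = P @ h / (1.0 + h @ P @ h)      # RLS gain vector
        x = x + g * innov                  # estimate update
        P = P - np.outer(g, h @ P)         # rank-one downdate of P
    return x

# Example call on synthetic data:
# rng = np.random.default_rng(0)
# H = rng.standard_normal((500, 4)); x_true = np.ones(4)
# y = H @ x_true + 0.1 * rng.standard_normal(500)
# print(censored_rls(H, y, tau=0.5))

In this sketch tau directly controls the fraction of skipped updates; the paper's criterion for choosing the innovation threshold to meet a prescribed average data reduction, and its decentralized variants that also censor transmissions and receptions, are not reproduced here.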
Pages: 1634-1648 (15 pages)