Learning Over Multitask Graphs-Part II: Performance Analysis

Cited by: 7
Authors
Nassif, Roula [1 ,2 ]
Vlaski, Stefan [1 ]
Richard, Cedric [3 ]
Sayed, Ali H. [1 ]
Affiliations
[1] Ecole Polytech Fed Lausanne, Inst Elect Engn, CH-1015 Lausanne, Switzerland
[2] Amer Univ Beirut, Beirut 11072020, Lebanon
[3] Univ Nice Sophia Antipolis, F-06100 Nice, France
Source
IEEE OPEN JOURNAL OF SIGNAL PROCESSING | 2020, Vol. 1, Issue 01
Funding
Swiss National Science Foundation
Keywords
Multitask distributed inference; diffusion strategy; smoothness prior; graph Laplacian regularization; gradient noise; steady-state performance; NETWORKS; ALGORITHMS; BEHAVIOR; LMS;
DOI
10.1109/OJSP.2020.2989031
Chinese Library Classification (CLC): TM [Electrical Engineering]; TN [Electronic and Communication Technology]
Discipline Code: 0808; 0809
Abstract
Part I of this paper formulated a multitask optimization problem where agents in the network have individual objectives to meet, or individual parameter vectors to estimate, subject to a smoothness condition over the graph. A diffusion strategy was devised that responds to streaming data and employs stochastic approximations in place of actual gradient vectors, which are generally unavailable. The approach relied on minimizing a global cost consisting of the aggregate sum of individual costs regularized by a term that promotes smoothness. We examined the first-order, the second-order, and the fourth-order stability of the multitask learning algorithm. The results identified conditions on the step-size parameter, regularization strength, and data characteristics in order to ensure stability. This Part II examines steady-state performance of the strategy. The results reveal explicitly the influence of the network topology and the regularization strength on the network performance and provide insights into the design of effective multitask strategies for distributed inference over networks.
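The abstract describes minimizing an aggregate cost (the sum of the individual agent costs plus a graph-Laplacian smoothness penalty) with a diffusion strategy driven by stochastic gradient approximations of streaming data. The snippet below is a minimal illustrative sketch, not the authors' reference implementation: it assumes mean-square-error costs, a hypothetical 5-agent line graph, and illustrative values for the step size mu and regularization strength eta, and it follows an adapt-then-combine pattern (local LMS update, then Laplacian smoothing of the intermediate estimates).

```python
import numpy as np

# Sketch of a multitask diffusion LMS strategy with graph Laplacian
# regularization, in the spirit of the strategy summarized in the abstract.
# All numerical choices below (graph, dimensions, mu, eta, noise level)
# are hypothetical and chosen only for illustration.

rng = np.random.default_rng(0)

N, M = 5, 3            # number of agents, parameter dimension
mu, eta = 0.01, 1.0    # step size and regularization strength (illustrative)

# Unweighted line-graph Laplacian L = D - A.
A = np.zeros((N, N))
for k in range(N - 1):
    A[k, k + 1] = A[k + 1, k] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Smoothly varying tasks: neighboring agents have similar optimal vectors.
w_true = np.array([np.ones(M) + 0.1 * k * np.arange(1, M + 1) for k in range(N)])

w = np.zeros((N, M))   # current estimates for all agents
sigma_v = 0.1          # observation noise standard deviation

for i in range(5000):
    # Adaptation step: stochastic-gradient (LMS) update at every agent,
    # using an instantaneous approximation of the true gradient.
    psi = np.empty_like(w)
    for k in range(N):
        u = rng.standard_normal(M)                           # regressor u_{k,i}
        d = u @ w_true[k] + sigma_v * rng.standard_normal()  # measurement d_k(i)
        grad_hat = -2.0 * (d - u @ w[k]) * u                 # gradient of (d - u^T w)^2
        psi[k] = w[k] - mu * grad_hat

    # Combination step: Laplacian smoothing of the intermediate estimates,
    # w_k <- psi_k - mu * eta * sum_l L[k, l] * psi_l.
    w = psi - mu * eta * (L @ psi)

# Empirical proxy for the steady-state network mean-square deviation (MSD).
msd = np.mean(np.sum((w - w_true) ** 2, axis=1))
print(f"empirical network MSD ~ {msd:.2e}")
```

In this sketch, small mu and moderate eta keep the combination matrix (I - mu * eta * L) well behaved; the stability conditions on these quantities are the subject of Part I, and their effect on the steady-state MSD is what this Part II analyzes.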
Pages: 46 - 63
Number of pages: 18
Related Papers
50 records in total
  • [1] Nassif, R., Vlaski, S., Richard, C., and Sayed, A. H., "Learning Over Multitask Graphs-Part I: Stability Analysis," IEEE Open Journal of Signal Processing, vol. 1, pp. 28-45, 2020.
  • [2] Nassif, R., Vlaski, S., Richard, C., and Sayed, A. H., "A Regularization Framework for Learning Over Multitask Graphs," IEEE Signal Processing Letters, vol. 26, no. 2, pp. 297-301, 2019.
  • [3] Young, G. F., Scardovi, L., and Leonard, N. E., "A New Notion of Effective Resistance for Directed Graphs-Part II: Computing Resistances," IEEE Transactions on Automatic Control, vol. 61, no. 7, pp. 1737-1752, 2016.
  • [4] Nassif, R., Vlaski, S., and Sayed, A. H., "Distributed Inference over Multitask Graphs under Smoothness," 2018 IEEE 19th International Workshop on Signal Processing Advances in Wireless Communications (SPAWC), 2018, pp. 631-635.
  • [5] Nassif, R., Vlaski, S., Richard, C., Chen, J., and Sayed, A. H., "Multitask Learning Over Graphs: An Approach for Distributed, Streaming Machine Learning," IEEE Signal Processing Magazine, vol. 37, no. 3, pp. 14-25, 2020.
  • [6] Nassif, R., Vlaski, S., and Sayed, A. H., "Adaptation and Learning Over Networks Under Subspace Constraints-Part II: Performance Analysis," IEEE Transactions on Signal Processing, vol. 68, pp. 2948-2962, 2020.
  • [7] Chen, J. and Sayed, A. H., "On the Learning Behavior of Adaptive Networks-Part II: Performance Analysis," IEEE Transactions on Information Theory, vol. 61, no. 6, pp. 3518-3548, 2015.
  • [8] Hua, F., Nassif, R., Richard, C., Wang, H., and Sayed, A. H., "Online Distributed Learning Over Graphs With Multitask Graph-Filter Models," IEEE Transactions on Signal and Information Processing over Networks, vol. 6, pp. 63-77, 2020.
  • [9] Young, G. F., Scardovi, L., and Leonard, N. E., "A New Notion of Effective Resistance for Directed Graphs-Part I: Definition and Properties," IEEE Transactions on Automatic Control, vol. 61, no. 7, pp. 1727-1736, 2016.
  • [10] Ying, B. and Sayed, A. H., "Performance limits of stochastic sub-gradient learning, part II: Multi-agent case," Signal Processing, vol. 144, pp. 253-264, 2018.