Clustered Federated Learning via Generalized Total Variation Minimization

Cited by: 3
Authors
Sarcheshmehpour, Yasmin [1 ]
Tian, Yu [1 ]
Zhang, Linli [2 ]
Jung, Alexander [1 ]
Affiliations
[1] Aalto Univ, Dept Comp Sci, Espoo 02150, Finland
[2] Shanghai Jiao Tong Univ, Shanghai 200030, Peoples R China
Keywords
Federated learning; clustering; complex networks; total variation; regularization; NETWORK; OPTIMIZATION;
DOI
10.1109/TSP.2023.3322848
CLC Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject Classification
0808; 0809;
Abstract
We study optimization methods to train local (or personalized) models for decentralized collections of local datasets with an intrinsic network structure. This network structure arises from domain-specific notions of similarity between local datasets. Examples of such notions include spatio-temporal proximity, statistical dependencies or functional relations. Our main conceptual contribution is to formulate federated learning as generalized total variation (GTV) minimization. This formulation unifies and considerably extends existing federated learning methods. It is highly flexible and can be combined with a broad range of parametric models, including generalized linear models or deep neural networks. Our main algorithmic contribution is a fully decentralized federated learning algorithm. This algorithm is obtained by applying an established primal-dual method to solve GTV minimization. It can be implemented as message passing and is robust against inexact computations that arise from limited computational resources, including processing time or bandwidth. Our main analytic contribution is an upper bound on the deviation between the local model parameters learnt by our algorithm and an oracle-based clustered federated learning method. This upper bound reveals conditions on the local models and the network structure of local datasets such that GTV minimization is able to pool (nearly) homogeneous local datasets.
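The GTV formulation described in the abstract can be illustrated with a toy sketch: local linear-regression models sit on the nodes of a small graph, and a weighted squared-norm GTV penalty couples the parameters of connected nodes. The sketch below uses plain gradient descent as a hypothetical stand-in for the paper's primal-dual message-passing algorithm; the two-cluster setup, edge weights, and all variable names are illustrative assumptions, not the authors' experiment.

```python
import numpy as np

# Toy GTV-minimization sketch (illustrative only, not the paper's method):
# each node i holds a local dataset (X_i, y_i); edges with weights A_ij
# couple neighbouring parameter vectors via lam * A_ij * ||w_i - w_j||^2.

rng = np.random.default_rng(0)

# Two clusters of nodes with different true weight vectors.
w_true = {0: np.array([1.0, -2.0]), 1: np.array([-3.0, 0.5])}
nodes = [0, 0, 1, 1]                               # cluster label per node
edges = [(0, 1, 1.0), (2, 3, 1.0), (1, 2, 0.01)]   # (i, j, A_ij): weak inter-cluster link

data = []
for c in nodes:
    X = rng.normal(size=(20, 2))
    y = X @ w_true[c] + 0.1 * rng.normal(size=20)
    data.append((X, y))

def gtv_grad(W, lam):
    """Gradient of sum_i ||X_i w_i - y_i||^2 / (2 m_i)
       + lam * sum_{(i,j)} A_ij ||w_i - w_j||^2."""
    grad = np.zeros_like(W)
    for i, (X, y) in enumerate(data):
        grad[i] = X.T @ (X @ W[i] - y) / len(y)
    for i, j, a in edges:
        diff = W[i] - W[j]
        grad[i] += 2 * lam * a * diff
        grad[j] -= 2 * lam * a * diff
    return grad

W = np.zeros((len(nodes), 2))
for _ in range(2000):
    W -= 0.05 * gtv_grad(W, lam=1.0)

# With a strong intra-cluster penalty and a weak inter-cluster edge, nodes
# within a cluster pool their data: their learned parameters nearly coincide,
# while parameters across the weak link stay well separated.
print(np.round(W, 2))
```

Note the design choice mirrored from the abstract: the smoothness penalty, not an explicit cluster assignment, is what pools (nearly) homogeneous local datasets, and the weak inter-cluster edge shows that GTV minimization tolerates a few "wrong" links in the similarity graph.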
Pages: 4240-4256
Page count: 17
Related Papers
50 records in total
  • [41] Global total variation minimization
    Dibos, F
    Koepfler, G
    SIAM JOURNAL ON NUMERICAL ANALYSIS, 2000, 37 (02) : 646 - 664
  • [42] Convex MR brain image reconstruction via non-convex total variation minimization
    Liu, Yilin
    Du, Huiqian
    Wang, Zexian
    Mei, Wenbo
    INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, 2018, 28 (04) : 246 - 253
  • [43] Federated Learning Via Inexact ADMM
    Zhou, Shenglong
    Li, Geoffrey Ye
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (08) : 9699 - 9708
  • [44] Ameliorating clustered federated learning using grey wolf optimization algorithm for healthcare IoT devices
    Chaudhary, Rajesh Kumar
    Kumar, Ravinder
    Saxena, Nitin
    THE JOURNAL OF SUPERCOMPUTING, 81 (8)
  • [45] Clustered Federated Learning Based on Client's Prototypes
    Lai, Weimin
    Xu, Zirong
    Yan, Qiao
    PROCEEDINGS OF THE 2024 27TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN, CSCWD 2024, 2024, : 909 - 914
  • [46] A dynamic adaptive iterative clustered federated learning scheme
    Du, Run
    Xu, Shuo
    Zhang, Rui
    Xu, Lijuan
    Xia, Hui
    KNOWLEDGE-BASED SYSTEMS, 2023, 276
  • [47] Generalized Total Variation: Tying the Knots
    Selesnick, Ivan W.
    IEEE SIGNAL PROCESSING LETTERS, 2015, 22 (11) : 2009 - 2013
  • [48] On semismooth Newton's methods for total variation minimization
    Ng, Michael K.
    Qi, Liqun
    Yang, Yu-Fei
    Huang, Yu-Mei
    JOURNAL OF MATHEMATICAL IMAGING AND VISION, 2007, 27 (03) : 265 - 276
  • [49] On-The-Fly Approximation of Multivariate Total Variation Minimization
    Frecon, Jordan
    Pustelnik, Nelly
    Abry, Patrice
    Condat, Laurent
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2016, 64 (09) : 2355 - 2364