Distributed Collaborative Learning with Representative Knowledge Sharing

Cited by: 0
Authors
Casey, Joseph [1 ]
Chen, Qianjiao [1 ]
Fan, Mengchen [2 ]
Geng, Baocheng [2 ]
Shterenberg, Roman [1 ]
Chen, Zhong [3 ]
Li, Keren [1 ]
Affiliations
[1] Univ Alabama Birmingham, Dept Math, Birmingham, AL 35294 USA
[2] Univ Alabama Birmingham, Dept Comp Sci, Birmingham, AL 35294 USA
[3] Southern Illinois Univ, Sch Comp, Carbondale, IL 62901 USA
Keywords
collaborative transfer learning; knowledge distillation; contrastive learning; federated learning
DOI
10.3390/math13061004
CLC Number
O1 [Mathematics];
Discipline Code
0701; 070101
Abstract
Distributed Collaborative Learning (DCL) addresses critical challenges in privacy-aware machine learning by enabling indirect knowledge transfer across nodes with heterogeneous feature distributions. Unlike conventional federated learning approaches, DCL assumes non-IID data and prediction task distributions that span beyond local training data, requiring selective collaboration to achieve generalization. In this work, we propose a novel collaborative transfer learning (CTL) framework that utilizes representative datasets and adaptive distillation weights to facilitate efficient and privacy-preserving collaboration. By leveraging Energy Coefficients to quantify node similarity, CTL dynamically selects optimal collaborators and refines local models through knowledge distillation on shared representative datasets. Simulations demonstrate the efficacy of CTL in improving prediction accuracy across diverse tasks while balancing trade-offs between local and global performance. Furthermore, we explore the impact of data spread and dispersion on collaboration, highlighting the importance of tailored node alignment. This framework provides a scalable foundation for cross-domain generalization in distributed machine learning.
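The pipeline the abstract describes, weighting knowledge distillation on a shared representative dataset by a node-similarity score, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Energy Coefficient is not defined in this record, so a cosine similarity between node feature statistics stands in for it, and all function names and parameters are hypothetical.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete probability distributions."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def cosine_similarity(u, v):
    """Stand-in for the paper's Energy Coefficient: similarity of
    two nodes' summary-statistic vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def distillation_loss(student_logits, teacher_logits, similarity, temperature=2.0):
    """Similarity-weighted distillation loss on one shared representative sample.

    The weight grows with the similarity score, so a node learns more from
    collaborators whose data distribution resembles its own and ignores
    dissimilar nodes entirely (a crude form of collaborator selection).
    """
    weight = max(similarity, 0.0)  # drop negatively-similar collaborators
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    return weight * kl_divergence(p_teacher, p_student)
```

A usage pattern consistent with the abstract: each node computes a similarity score against every potential collaborator, keeps only the highest-scoring ones, and adds the weighted distillation term to its local training loss.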
Pages: 20
References (32 total)
  • [11] Huang ZH, 2017, arXiv:1707.01219
  • [12] Jeong W, 2021, arXiv:2006.12097
  • [13] Li DL, 2019, arXiv:1910.03581
  • [14] Li K, Yang J. Score-matching representative approach for big data analysis with generalized linear models. Electronic Journal of Statistics, 2022, 16(1): 592-635
  • [15] Li T, 2020, Proceedings of Machine Learning and Systems, V2, P429
  • [16] Li T, 2021, Proceedings of Machine Learning Research, V139
  • [17] Malinin A, 2019, arXiv:1905.00076
  • [18] McMahan HB, 2017, Proceedings of Machine Learning Research, V54, P1273
  • [19] Mu X, Shen Y, Cheng K, Geng X, Fu J, Zhang T, Zhang Z. FedProc: Prototypical contrastive federated learning on non-IID data. Future Generation Computer Systems, 2023, 143: 93-104
  • [20] Pan SJ, Yang Q. A Survey on Transfer Learning. IEEE Transactions on Knowledge and Data Engineering, 2010, 22(10): 1345-1359