Distributed Collaborative Learning with Representative Knowledge Sharing

Cited by: 0
Authors
Casey, Joseph [1 ]
Chen, Qianjiao [1 ]
Fan, Mengchen [2 ]
Geng, Baocheng [2 ]
Shterenberg, Roman [1 ]
Chen, Zhong [3 ]
Li, Keren [1 ]
Affiliations
[1] Univ Alabama Birmingham, Dept Math, Birmingham, AL 35294 USA
[2] Univ Alabama Birmingham, Dept Comp Sci, Birmingham, AL 35294 USA
[3] Southern Illinois Univ, Sch Comp, Carbondale, IL 62901 USA
Keywords
collaborative transfer learning; knowledge distillation; contrastive learning; federated learning
DOI
10.3390/math13061004
Chinese Library Classification (CLC): O1 [Mathematics]
Discipline Classification Code: 0701; 070101
Abstract
Distributed Collaborative Learning (DCL) addresses critical challenges in privacy-aware machine learning by enabling indirect knowledge transfer across nodes with heterogeneous feature distributions. Unlike conventional federated learning approaches, DCL assumes non-IID data and prediction task distributions that span beyond local training data, requiring selective collaboration to achieve generalization. In this work, we propose a novel collaborative transfer learning (CTL) framework that utilizes representative datasets and adaptive distillation weights to facilitate efficient and privacy-preserving collaboration. By leveraging Energy Coefficients to quantify node similarity, CTL dynamically selects optimal collaborators and refines local models through knowledge distillation on shared representative datasets. Simulations demonstrate the efficacy of CTL in improving prediction accuracy across diverse tasks while balancing trade-offs between local and global performance. Furthermore, we explore the impact of data spread and dispersion on collaboration, highlighting the importance of tailored node alignment. This framework provides a scalable foundation for cross-domain generalization in distributed machine learning.
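The abstract does not spell out how the Energy Coefficient or the adaptive distillation weights are computed, so the sketch below is only illustrative. It uses an RBF-kernel (MMD-style) similarity between nodes' representative feature sets as a stand-in for the Energy Coefficient, ranks the other nodes by that score, and turns the top-k scores into softmax weights. The function names (energy_coefficient, select_collaborators), the kernel choice, and the softmax weighting are assumptions for illustration, not the authors' implementation.

# Minimal sketch of collaborator selection with adaptive weights (illustrative only).
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # Pairwise RBF kernel between two sets of feature vectors.
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def energy_coefficient(rep_a, rep_b, gamma=1.0):
    # Stand-in similarity between two nodes' representative datasets:
    # a sign-flipped (biased) MMD^2 estimate, so larger means more similar.
    kab = rbf_kernel(rep_a, rep_b, gamma).mean()
    kaa = rbf_kernel(rep_a, rep_a, gamma).mean()
    kbb = rbf_kernel(rep_b, rep_b, gamma).mean()
    return 2 * kab - kaa - kbb

def select_collaborators(reps, node, k=2):
    # Score every other node, keep the top-k, and convert their scores
    # into softmax weights that could scale a distillation loss.
    scores = {j: energy_coefficient(reps[node], r)
              for j, r in enumerate(reps) if j != node}
    ranked = sorted(scores, key=scores.get, reverse=True)[:k]
    s = np.array([scores[j] for j in ranked])
    w = np.exp(s - s.max())
    w /= w.sum()
    return dict(zip(ranked, w))

# Toy usage: 3 nodes, each publishing a small representative dataset (features only).
rng = np.random.default_rng(0)
reps = [rng.normal(loc=m, size=(50, 4)) for m in (0.0, 0.2, 3.0)]
print(select_collaborators(reps, node=0, k=2))  # node 1 should receive the larger weight

In the full framework these weights would modulate a knowledge-distillation loss evaluated on the shared representative datasets; here they are simply returned to show the selection step in isolation.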
Pages: 20