Preservation of the Global Knowledge by Not-True Distillation in Federated Learning

Cited by: 0
Authors
Lee, Gihun [1 ]
Jeong, Minchan [1 ]
Shin, Yongjin [1 ]
Bae, Sangmin [1 ]
Yun, Se-Young [1 ]
Affiliations
[1] Korea Adv Inst Sci & Technol, Daejeon, South Korea
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022) | 2022
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In federated learning, a strong global model is learned collaboratively by aggregating clients' locally trained models. Although this precludes the need to access clients' data directly, the global model's convergence often suffers from data heterogeneity. Starting from an analogy to continual learning, this study suggests that forgetting could be the bottleneck of federated learning. We observe that the global model forgets the knowledge from previous rounds, and that local training induces forgetting of the knowledge outside the local distribution. Based on these findings, we hypothesize that tackling forgetting will relieve the data heterogeneity problem. To this end, we propose a novel and effective algorithm, Federated Not-True Distillation (FedNTD), which preserves the global perspective on locally available data only for the not-true classes. In experiments, FedNTD achieves state-of-the-art performance across various setups without compromising data privacy or incurring additional communication costs.
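The core mechanism named in the abstract, distilling the global model's predictions on the not-true classes only, can be summarized as a short client-side loss. Below is a minimal sketch assuming a PyTorch training loop; the function name `ntd_loss` and the hyperparameters `tau` (temperature) and `beta` (distillation weight) are illustrative placeholders, and the temperature scaling follows standard knowledge distillation conventions rather than the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def ntd_loss(local_logits, global_logits, targets, tau=1.0, beta=1.0):
    """Supervised loss on the true class, plus KL distillation from the
    frozen global model restricted to the not-true classes (sketch)."""
    # Standard cross-entropy on the locally available labels.
    ce = F.cross_entropy(local_logits, targets)

    # Mask out the ground-truth class in every row, so the softmax below
    # is taken over the remaining (not-true) classes only.
    batch_idx = torch.arange(local_logits.size(0), device=local_logits.device)
    not_true = torch.ones_like(local_logits, dtype=torch.bool)
    not_true[batch_idx, targets] = False
    num_classes = local_logits.size(1)

    local_nt = local_logits[not_true].view(-1, num_classes - 1)
    global_nt = global_logits[not_true].view(-1, num_classes - 1)

    # Distill the global (teacher) not-true predictions into the local
    # (student) model; tau**2 is the usual distillation loss scaling.
    kl = F.kl_div(
        F.log_softmax(local_nt / tau, dim=1),
        F.softmax(global_nt / tau, dim=1),
        reduction="batchmean",
    ) * (tau ** 2)

    return ce + beta * kl
```

In such a setup, `global_logits` would come from a frozen copy of the aggregated server model evaluated on the same mini-batch, which is how the client can preserve the global perspective on its own data without any additional communication.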
Pages: 14