Federated Class-Incremental Learning With Dynamic Feature Extractor Fusion

Cited by: 1
Authors
Lu, Yanyan [1 ]
Yang, Lei [1 ]
Chen, Hao-Rui [1 ]
Cao, Jiannong [2 ]
Lin, Wanyu [2]
Long, Saiqin [3 ]
Affiliations
[1] South China Univ Technol, Sch Software Engn, Guangzhou 510006, Peoples R China
[2] Hong Kong Polytech Univ, Dept Comp, Hong Kong, Peoples R China
[3] Jinan Univ, Coll Informat Sci & Technol, Guangzhou 510632, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Federated learning; class-incremental learning; feature extractor; exemplar storing
DOI
10.1109/TMC.2024.3419096
CLC number
TP [Automation Technology, Computer Technology]
Subject classification number
0812
Abstract
Federated class-incremental learning (FCIL) allows multiple clients in a distributed environment to learn models collaboratively from evolving data streams, where new classes arrive continually at each client. Some existing FCIL works directly combine traditional federated learning methods with class-incremental learning methods; however, under data heterogeneity, the resulting global model can aggravate local forgetting. To tackle this issue, we propose FCIDF, a novel Federated Class-Incremental learning approach based on Dynamic feature extractor Fusion. FCIDF learns a personalized, incremental model for each client by introducing personalized fusion rates that integrate global knowledge into local features. By leveraging meta-learning during each incremental round, FCIDF ensures that both old-task and new-task knowledge are involved in personalized training. In addition, we propose a new exemplar Storing strategy based on Accumulated Global Feature Means (AGFMS), which helps the model review unbiased old knowledge and compensates for local forgetting. Experimental results show that FCIDF outperforms the baseline methods in both accuracy and forgetting in most settings, and that AGFMS improves the performance of FCIDF at most evaluated scales.
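To make the fusion mechanism concrete, below is a minimal PyTorch sketch assuming the fused representation is a convex combination of global and local features weighted by a learnable per-client fusion rate. The class name FusedExtractor, the sigmoid parameterization, and the combination rule are illustrative assumptions based on the abstract, not the paper's exact formulation.

    import copy
    import torch
    import torch.nn as nn

    class FusedExtractor(nn.Module):
        """Hypothetical fusion of global and local feature extractors."""
        def __init__(self, global_extractor: nn.Module, local_extractor: nn.Module):
            super().__init__()
            # Frozen copy of the aggregated global extractor (global knowledge).
            self.global_extractor = copy.deepcopy(global_extractor)
            for p in self.global_extractor.parameters():
                p.requires_grad = False
            self.local_extractor = local_extractor
            # Personalized fusion rate, learned locally; sigmoid keeps it in (0, 1).
            self.alpha_logit = nn.Parameter(torch.zeros(1))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            alpha = torch.sigmoid(self.alpha_logit)
            g = self.global_extractor(x)   # global features
            l = self.local_extractor(x)    # client-specific features
            # Integrate global knowledge into the local representation.
            return alpha * g + (1.0 - alpha) * l

Under this reading, a client whose data diverges from the global distribution can learn a small fusion rate and rely mostly on its local extractor, while a well-aligned client can absorb more global knowledge.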
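Similarly, a hypothetical sketch of AGFMS-style exemplar storing, assuming the accumulated global feature mean is a momentum-weighted running average of per-round global class means, and that the stored exemplars are the samples whose features lie nearest to that mean (herding-style selection). The names, momentum update, and nearest-to-mean rule are assumptions for illustration only.

    import torch

    class ExemplarStore:
        """Hypothetical exemplar selection guided by accumulated global feature means."""
        def __init__(self, momentum: float = 0.9):
            self.momentum = momentum
            self.global_means = {}  # class id -> accumulated global feature mean

        def update_mean(self, cls: int, round_mean: torch.Tensor) -> None:
            # Accumulate the per-round global class mean into a running estimate.
            if cls not in self.global_means:
                self.global_means[cls] = round_mean.clone()
            else:
                m = self.global_means[cls]
                self.global_means[cls] = self.momentum * m + (1 - self.momentum) * round_mean

        def select(self, cls: int, feats: torch.Tensor, k: int) -> torch.Tensor:
            # Keep the k samples whose features are nearest to the accumulated mean,
            # so the reviewed exemplars reflect unbiased (globally averaged) old knowledge.
            mean = self.global_means[cls].unsqueeze(0)        # shape (1, d)
            dists = torch.cdist(feats, mean).squeeze(1)       # shape (n,)
            return torch.topk(dists, k, largest=False).indices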
Pages: 12969-12982
Page count: 14