pFedEff: An Efficient and Personalized Federated Cognitive Learning Framework in Multiagent Systems

Cited by: 2
Authors
Shi, Hongjian [1 ]
Zhang, Jianqing [1 ]
Fan, Shuming [1 ]
Ma, Ruhui [1 ]
Guan, Haibing [1 ]
Institutions
[1] Shanghai Jiao Tong Univ, Sch Elect Informat & Elect Engn, Shanghai 200240, Peoples R China
Keywords
Training; Computational modeling; Servers; Solid modeling; Multi-agent systems; Federated learning; Adaptation models; Cognitive learning (CL); model pruning; multiagent system; personalized federated learning (FL); INTELLIGENCE; DESCENT;
DOI
10.1109/TCDS.2023.3288985
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the increase in data volume and environment complexity, real-world problems require more advanced algorithms to acquire useful information for further analysis or decision making. Cognitive learning (CL) effectively handles incomplete information, and multiagent systems can provide enough data for analysis. Inspired by distributed machine learning, federated learning (FL) has become an efficient framework for implementing CL algorithms in multiagent systems while preserving user privacy. However, traditional communication optimizations on the FL framework suffer from either large communication volumes or large accuracy losses. In this article, we propose pFedEff, a personalized FL framework with efficient communication that reduces communication volume while preserving training accuracy. pFedEff uses two magnitude masks, two importance masks, and a personalized aggregation method to reduce the model and update size while maintaining training accuracy. Specifically, we use a pretraining magnitude mask for approximated regularization to reduce the magnitude of ineffective parameters during training. We also use a post-training magnitude mask to eliminate low-magnitude parameters after training. Furthermore, we use uploading and downloading importance masks to reduce the communication volume in both the upload and download streams. Our experimental results show that pFedEff reduces communication volume by up to 94% with only a 1% accuracy loss compared with other state-of-the-art FL algorithms. In addition, we conduct multiple ablation studies to evaluate the influence of hyperparameters in pFedEff, which show the flexibility of pFedEff and its applicability to different scenarios.
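To make the masking idea in the abstract concrete, here is a minimal sketch (not the authors' code) of a post-training magnitude mask: the client zeroes low-magnitude entries of its model update and uploads only the surviving indices and values, which is how magnitude pruning shrinks the upload stream. The keep-top-fraction threshold rule and the (index, value) sparse encoding are assumptions for illustration, not details from the paper.

```python
import numpy as np

def magnitude_mask(update: np.ndarray, keep_fraction: float) -> np.ndarray:
    """Boolean mask that keeps the largest-|magnitude| entries of a flat update."""
    k = max(1, int(round(keep_fraction * update.size)))
    # k-th largest absolute value serves as the pruning threshold
    threshold = np.sort(np.abs(update))[-k]
    return np.abs(update) >= threshold

def sparsify_update(update: np.ndarray, keep_fraction: float):
    """Client side: return (indices, values) of the masked update for upload."""
    mask = magnitude_mask(update, keep_fraction)
    idx = np.flatnonzero(mask)
    return idx, update[idx]

def densify(idx: np.ndarray, values: np.ndarray, size: int) -> np.ndarray:
    """Server side: rebuild a dense update from the sparse upload."""
    dense = np.zeros(size)
    dense[idx] = values
    return dense
```

Under this sketch, keeping roughly 6% of the entries would correspond to the ~94% communication reduction the abstract reports, though pFedEff combines this with importance masks and personalized aggregation rather than magnitude pruning alone.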
Pages: 31-45 (15 pages)
Related Papers
50 records
  • [21] CPPer-FL: Clustered Parallel Training for Efficient Personalized Federated Learning
    Zhang, Ran
    Liu, Fangqi
    Liu, Jiang
    Chen, Mingzhe
    Tang, Qinqin
    Huang, Tao
    Yu, F. Richard
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (10) : 9424 - 9436
  • [22] Personalized Federated Learning With Server-Side Information
    Song, Jaehun
    Oh, Min-Hwan
    Kim, Hyung-Sin
    IEEE ACCESS, 2022, 10 : 120245 - 120255
  • [23] CF4FL: A Communication Framework for Federated Learning in Transportation Systems
    Sangdeh, Pedram Kheirkhah
    Li, Chengzhang
    Pirayesh, Hossein
    Zhang, Shichen
    Zeng, Huacheng
    Hou, Y. Thomas
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2023, 22 (06) : 3821 - 3836
  • [24] Communication-Efficient Federated Learning for Large-Scale Multiagent Systems in ISAC: Data Augmentation With Reinforcement Learning
    Ouyang, Wenjiang
    Liu, Qian
    Mu, Junsheng
    Al-Dulaimi, Anwer
    Jing, Xiaojun
    Liu, Qilie
    IEEE SYSTEMS JOURNAL, 2024, : 1893 - 1904
  • [25] Dual Model Pruning Enables Efficient Federated Learning in Intelligent Transportation Systems
    Pei, Jiaming
    Li, Wei
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2024,
  • [26] Energy Optimization and Lightweight Design for Efficient Federated Learning in Wireless Edge Systems
    Lei, Lei
    Yuan, Yaxiong
    Zhou, Yu
    Yang, Yang
    Luo, Yu
    Pu, Lina
    Chatzinotas, Symeon
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2024, 73 (09) : 13542 - 13557
  • [27] Model Pruning Enables Efficient Federated Learning on Edge Devices
    Jiang, Yuang
    Wang, Shiqiang
    Valls, Victor
    Ko, Bong Jun
    Lee, Wei-Han
    Leung, Kin K.
    Tassiulas, Leandros
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (12) : 10374 - 10386
  • [28] Privacy-Preserving Heterogeneous Personalized Federated Learning With Knowledge
    Pan, Yanghe
    Su, Zhou
    Ni, Jianbing
    Wang, Yuntao
    Zhou, Jinhao
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (06): : 5969 - 5982
  • [29] AsyncFedGAN: An Efficient and Staleness-Aware Asynchronous Federated Learning Framework for Generative Adversarial Networks
    Manu, Daniel
    Alazzwi, Abee
    Yao, Jingjing
    Lin, Youzuo
    Sun, Xiang
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2025, 36 (03) : 553 - 569
  • [30] An Efficient Framework for Clustered Federated Learning
    Ghosh, Avishek
    Chung, Jichan
    Yin, Dong
    Ramchandran, Kannan
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2022, 68 (12) : 8076 - 8091