TinyFL_HKD: Enhancing Edge AI Federated Learning With Hierarchical Knowledge Distillation Framework

Cited by: 0
Authors
Hung, Chung-Wen [1 ]
Tsai, Cheng-Yu [1 ]
Wang, Chun-Chieh [1 ]
Lee, Ching-Hung [2 ,3 ]
Affiliations
[1] Natl Yunlin Univ Sci & Technol, Dept Elect Engn, Yunlin 64002, Taiwan
[2] Natl Yang Ming Chiao Tung Univ, Inst Elect & Control Engn, Hsinchu 30010, Taiwan
[3] Chung Yuan Christian Univ, Dept Elect Engn, Taoyuan City 320314, Taiwan
Keywords
Computational modeling; Training; Artificial intelligence; Data privacy; Vectors; Random access memory; Nonvolatile memory; Ions; Ferroelectric films; Servers; Encryption; Federated learning; Hierarchical learning; Attacks; Privacy
DOI
10.1109/JSEN.2025.3544861
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
With the rapid evolution of artificial intelligence (AI) and the Internet of Things (IoT), machine learning is increasingly being integrated into embedded systems, bringing computation closer to where data are generated. This article introduces TinyFL_HKD, a tiny federated learning framework that addresses privacy, personalized training, and the constrained computational resources of edge platforms through a novel hierarchical knowledge distillation (HKD) scheme. HKD combines hierarchical learning with Advanced Encryption Standard (AES) encryption to ensure data privacy and security, and employs knowledge distillation to reduce model complexity for deployment on edge devices while enhancing personalization. TinyFL_HKD is evaluated on two datasets: a tool wear dataset and the PHM 2010 Data Challenge dataset. Experimental results indicate that the HKD framework surpasses traditional federated averaging (FedAvg) and personalized federated learning (PFL) algorithms in both model accuracy and computational efficiency, establishing HKD as a resilient solution for edge AI applications.
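To make the abstract's central mechanism concrete, below is a minimal sketch of a standard knowledge-distillation loss in PyTorch: a compact student model is trained to match a larger teacher's temperature-softened outputs while still fitting the hard labels. This is not the paper's TinyFL_HKD implementation; the temperature T, the blending weight alpha, and the PyTorch framing are illustrative assumptions only.

```python
# Minimal knowledge-distillation loss sketch (assumed setup, not the
# paper's TinyFL_HKD code): a small student mimics a larger teacher's
# temperature-softened outputs while still fitting the hard labels.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      T: float = 4.0,
                      alpha: float = 0.5) -> torch.Tensor:
    # Soft term: KL divergence between temperature-scaled distributions;
    # the T*T factor keeps gradient magnitudes comparable across T values.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard term: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

The abstract also credits AES with securing the data exchanged in the hierarchy. One plausible reading, sketched below, is that each client encrypts its serialized model update before sending it upstream; the AES-GCM mode, pre-shared key, and byte-serialized update format are assumptions, as the abstract does not specify the protocol.

```python
# Hypothetical AES-GCM wrapper for model updates, using the `cryptography`
# package. Key distribution and the update wire format are assumptions.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_update(key: bytes, update_bytes: bytes) -> bytes:
    nonce = os.urandom(12)                    # fresh 96-bit nonce per message
    return nonce + AESGCM(key).encrypt(nonce, update_bytes, None)

def decrypt_update(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]  # split off the prepended nonce
    return AESGCM(key).decrypt(nonce, ciphertext, None)
```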
Pages: 12038-12047
Page count: 10