Two-Level Privacy-Preserving Framework: Federated Learning for Attack Detection in the Consumer Internet of Things

Cited by: 10
Authors
Rabieinejad, Elnaz [1 ]
Yazdinejad, Abbas [1 ]
Dehghantanha, Ali [1 ]
Srivastava, Gautam [2 ,3 ,4 ]
Affiliations
[1] Univ Guelph, Sch Comp Sci, Cyber Sci Lab, Canada Cyber Foundry, Guelph, ON N1H 6S1, Canada
[2] Brandon Univ, Dept Math & Comp Sci, Brandon, MB R7A 6A9, Canada
[3] China Med Univ, Res Ctr Interneural Comp, Taichung 40402, Taiwan
[4] Lebanese Amer Univ, Dept Comp Sci & Math, Beirut 1102, Lebanon
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
Privacy; Security; Data privacy; Cryptography; Data models; Computational modeling; Servers; FL; PHE; ConsumerIoT; privacy; attack detection;
DOI
10.1109/TCE.2024.3349490
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic and Communication Technology];
Discipline Codes
0808 ; 0809 ;
Abstract
As the adoption of Consumer Internet of Things (CIoT) devices surges, so do concerns about security vulnerabilities and privacy breaches. Given their integration into daily life and their data collection capabilities, it is crucial to proactively safeguard user privacy against unauthorized access and potential leaks. Federated learning, an advanced machine learning paradigm, provides a promising solution by inherently prioritizing privacy, circumventing the need for centralized data collection, and bolstering security. Yet, federated learning opens up avenues for adversaries to extract critical information from the machine learning model through data leakage and model inference attacks targeted at the central server. In response to this particular concern, we present an innovative two-level privacy-preserving framework in this paper. This framework synergistically combines federated learning with partially homomorphic encryption, which we favor over other methods such as fully homomorphic encryption and differential privacy. Our preference for partially homomorphic encryption is based on its superior balance between computational efficiency and model performance. This advantage becomes particularly relevant when considering the intense computational demands of fully homomorphic encryption and the sacrifice in model accuracy often associated with differential privacy. Incorporating partially homomorphic encryption augments federated learning's privacy assurance, introducing an additional protective layer. The fundamental properties of partially homomorphic encryption enable the central server to aggregate and compute operations on the encrypted local models without decryption, thereby preserving sensitive information from potential exposure. Empirical results substantiate the efficacy of the proposed framework, which significantly reduces attack prediction error rates and false alarms compared to conventional methods. Moreover, through security analysis, we demonstrate our proposed framework's enhanced privacy compared to existing methods that deploy federated learning for attack detection.
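The abstract's core mechanism, a central server summing encrypted client model updates without ever decrypting them, can be sketched with a toy additively homomorphic (Paillier-style) cryptosystem. This is a minimal illustration, not the paper's implementation: the primes are demo-sized and insecure, weights are quantized to integers for illustration, and the function names (`keygen`, `encrypt`, `add_encrypted`) are hypothetical.

```python
import random
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

def keygen():
    # Demo-sized primes; a real deployment would use ~1024-bit primes.
    p, q = 293, 433
    n = p * q
    lam = lcm(p - 1, q - 1)
    g = n + 1                      # standard simple generator choice
    n2 = n * n
    # mu = (L(g^lam mod n^2))^{-1} mod n, where L(x) = (x - 1) // n
    L = (pow(g, lam, n2) - 1) // n
    mu = pow(L, -1, n)
    return (n, g), (lam, mu, n)    # (public key, private key)

def encrypt(pk, m):
    n, g = pk
    n2 = n * n
    while True:                    # random blinding factor coprime to n
        r = random.randrange(1, n)
        if gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(sk, c):
    lam, mu, n = sk
    L = (pow(c, lam, n * n) - 1) // n
    return (L * mu) % n

def add_encrypted(pk, c1, c2):
    # Multiplying ciphertexts adds the underlying plaintexts:
    # the server aggregates without seeing any client's update.
    n, _ = pk
    return (c1 * c2) % (n * n)

# Clients quantize a weight update (e.g. 0.500 -> 500) and encrypt it;
# the server folds the ciphertexts together and only the key holder
# can decrypt the aggregate.
pk, sk = keygen()
client_updates = [500, 300, 200]
ciphertexts = [encrypt(pk, w) for w in client_updates]
aggregate = ciphertexts[0]
for c in ciphertexts[1:]:
    aggregate = add_encrypted(pk, aggregate, c)
print(decrypt(sk, aggregate))      # equals sum(client_updates)
```

The design point this illustrates is the trade-off the abstract describes: an additive scheme supports exactly the operation federated averaging needs (summation) at far lower cost than fully homomorphic encryption, and without the accuracy loss of added noise.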
Pages: 4258 - 4265 (8 pages)