Two-Level Privacy-Preserving Framework: Federated Learning for Attack Detection in the Consumer Internet of Things

Cited by: 10
Authors
Rabieinejad, Elnaz [1 ]
Yazdinejad, Abbas [1 ]
Dehghantanha, Ali [1 ]
Srivastava, Gautam [2 ,3 ,4 ]
Affiliations
[1] Univ Guelph, Sch Comp Sci, Cyber Sci Lab, Canada Cyber Foundry, Guelph, ON N1H 6S1, Canada
[2] Brandon Univ, Dept Math & Comp Sci, Brandon, MB R7A 6A9, Canada
[3] China Med Univ, Res Ctr Interneural Comp, Taichung 40402, Taiwan
[4] Lebanese Amer Univ, Dept Comp Sci & Math, Beirut 1102, Lebanon
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC)
Keywords
Privacy; Security; Data privacy; Cryptography; Data models; Computational modeling; Servers; FL; PHE; Consumer IoT; privacy; attack detection
DOI
10.1109/TCE.2024.3349490
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
As the adoption of Consumer Internet of Things (CIoT) devices surges, so do concerns about security vulnerabilities and privacy breaches. Given their integration into daily life and their data collection capabilities, it is crucial to proactively safeguard user privacy against unauthorized access and potential leaks. Federated learning, an advanced machine learning paradigm, provides a promising solution by inherently prioritizing privacy, circumventing the need for centralized data collection, and bolstering security. Yet federated learning also opens avenues for adversaries to extract critical information from the machine learning model through data leakage and model inference attacks targeted at the central server. In response to this concern, we present an innovative two-level privacy-preserving framework in this paper. The framework synergistically combines federated learning with partially homomorphic encryption, which we favor over alternatives such as fully homomorphic encryption and differential privacy. Our preference for partially homomorphic encryption is based on its superior balance between computational efficiency and model performance, an advantage that becomes particularly relevant given the intense computational demands of fully homomorphic encryption and the loss of model accuracy often associated with differential privacy. Incorporating partially homomorphic encryption augments federated learning's privacy assurance by introducing an additional protective layer. The fundamental properties of partially homomorphic encryption enable the central server to aggregate and compute on the encrypted local models without decryption, thereby shielding sensitive information from potential exposure. Empirical results substantiate the efficacy of the proposed framework, which significantly reduces attack prediction error rates and false alarms compared to conventional methods. Moreover, through security analysis, we show that our proposed framework provides stronger privacy than existing methods that deploy federated learning for attack detection.
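The framework's core mechanism is that the central server aggregates only encrypted local models: with an additively homomorphic scheme such as Paillier, ciphertexts can be summed and scaled by plaintext constants without ever being decrypted. The minimal sketch below illustrates that idea using the open-source python-paillier (phe) package; the client count, weight values, and key length are illustrative assumptions, and this is not the authors' implementation.

# Minimal sketch of additively homomorphic aggregation for federated learning,
# using the open-source `phe` (python-paillier) package. Client count, weight
# values, and key length are illustrative assumptions, not the paper's setup.
from functools import reduce
from phe import paillier

# Clients share a Paillier key pair; the aggregation server holds only the public key.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Hypothetical local model updates from three CIoT clients (one small weight vector each).
local_updates = [
    [0.12, -0.05, 0.30],
    [0.10, -0.02, 0.28],
    [0.15, -0.07, 0.33],
]

# Client side: encrypt every weight before it leaves the device.
encrypted_updates = [
    [public_key.encrypt(w) for w in update] for update in local_updates
]

# Server side: average the ciphertexts coordinate-wise. Paillier is additively
# homomorphic, so adding ciphertexts and scaling by a plaintext constant never
# requires decryption; the server learns nothing about any individual update.
num_clients = len(encrypted_updates)
encrypted_avg = [
    reduce(lambda a, b: a + b, (client[j] for client in encrypted_updates))
    * (1.0 / num_clients)
    for j in range(len(local_updates[0]))
]

# Key holders (the clients) decrypt only the aggregated global update.
global_update = [private_key.decrypt(c) for c in encrypted_avg]
print(global_update)  # roughly [0.1233, -0.0467, 0.3033]

In a real deployment the private key would stay with the clients or a separate key authority, never with the aggregation server, which is what keeps individual CIoT model updates hidden during aggregation.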
Pages: 4258-4265
Page count: 8
Related Papers (50 records in total)
  • [31] Hercules: Boosting the Performance of Privacy-Preserving Federated Learning. Xu, Guowen; Han, Xingshuo; Xu, Shengmin; Zhang, Tianwei; Li, Hongwei; Huang, Xinyi; Deng, Robert H. IEEE TRANSACTIONS ON DEPENDABLE AND SECURE COMPUTING, 2023, 20(05): 4418-4433.
  • [32] Visual Object Detection for Privacy-Preserving Federated Learning. Zhang, Jing; Zhou, Jiting; Guo, Jinyang; Sun, Xiaohan. IEEE ACCESS, 2023, 11: 33324-33335.
  • [33] A Secure and Privacy-preserving Internet of Things Framework for Smart City. Witti, Moussa; Konstantas, Dimitri. PROCEEDINGS OF THE 6TH INTERNATIONAL CONFERENCE ON INFORMATION TECHNOLOGY: IOT AND SMART CITY (ICIT 2018), 2018: 145-150.
  • [34] Privacy-Preserving Traffic Flow Prediction: A Federated Learning Approach. Liu, Yi; Yu, James J. Q.; Kang, Jiawen; Niyato, Dusit; Zhang, Shuyu. IEEE INTERNET OF THINGS JOURNAL, 2020, 7(08): 7751-7763.
  • [35] A Privacy-Preserving Social Computing Framework for Health Management Using Federated Learning. Shen, Zhangyi; Ding, Feng; Yao, Ye; Bhardwaj, Arpit; Guo, Zhiwei; Yu, Keping. IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS, 2023, 10(04): 1666-1678.
  • [36] Communication-Efficient Personalized Federated Learning With Privacy-Preserving. Wang, Qian; Chen, Siguang; Wu, Meng. IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT, 2024, 21(02): 2374-2388.
  • [37] TAPFed: Threshold Secure Aggregation for Privacy-Preserving Federated Learning. Xu, Runhua; Li, Bo; Li, Chao; Joshi, James B. D.; Ma, Shuai; Li, Jianxin. IEEE TRANSACTIONS ON DEPENDABLE AND SECURE COMPUTING, 2024, 21(05): 4309-4323.
  • [38] NSPFL: A Novel Secure and Privacy-Preserving Federated Learning With Data Integrity Auditing. Zhang, Zehu; Li, Yanping. IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2024, 19: 4494-4506.
  • [39] Privacy-Preserving Incentive Mechanism Design for Federated Cloud-Edge Learning. Liu, Tianyu; Di, Boya; An, Peng; Song, Lingyang. IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2021, 8(03): 2588-2600.
  • [40] BSR-FL: An Efficient Byzantine-Robust Privacy-Preserving Federated Learning Framework. Zeng, Honghong; Li, Jie; Lou, Jiong; Yuan, Shijing; Wu, Chentao; Zhao, Wei; Wu, Sijin; Wang, Zhiwen. IEEE TRANSACTIONS ON COMPUTERS, 2024, 73(08): 2096-2110.