PEFL: Privacy-Preserved and Efficient Federated Learning With Blockchain

Cited: 0
Authors
Tian, Lei [1 ,2 ]
Lin, Feilong [1 ,2 ]
Gan, Jiahao [1 ,2 ]
Jia, Riheng [1 ,2 ]
Zheng, Zhonglong [1 ,2 ]
Li, Minglu [1 ,2 ]
Affiliations
[1] Zhejiang Normal Univ, Key Lab Intelligent Educ Technol & Applicat Zheji, Jinhua 321004, Peoples R China
[2] Zhejiang Normal Univ, Coll Comp Sci & Technol, Jinhua 321004, Peoples R China
Source
IEEE INTERNET OF THINGS JOURNAL | 2025, Vol. 12, No. 03
Keywords
Computational modeling; Blockchains; Training; Servers; Data models; Security; Privacy; Protection; Federated learning; Internet of Things; Blockchain; consensus mechanism; differential privacy (DP); federated learning (FL); privacy security;
DOI
10.1109/JIOT.2024.3479328
Chinese Library Classification
TP [Automation technology; computer technology]
Discipline Classification Code
0812
Abstract
With the rise of federated learning (FL) in machine learning for data privacy protection, its distinctive distributed data processing characteristics have garnered widespread attention. However, implementing FL faces many challenges: balancing data privacy, model security, and system efficiency is difficult, and efficiency is often sacrificed for privacy and security. Moreover, this process typically assumes a trusted server for coordination. To address these challenges, this article proposes a privacy-preserved and efficient FL framework with blockchain (PEFL). PEFL utilizes blockchain and differential privacy techniques to coordinate privacy protection among clients, and filters out anomalous model parameters through an aggregation-side detection algorithm to resist poisoning attacks. Under the assumption of an untrusted server, we design a committee-based model-validated fault-tolerant federation (MFF) consensus mechanism that balances efficiency expectations to regulate the server and ensure the reliability of the training process. In experiments on the MNIST and CIFAR-10 datasets, compared with typical FL schemes, PEFL demonstrates stronger defense against various attack models and achieves higher training efficiency while ensuring privacy security.
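The abstract outlines two of PEFL's ingredients: client-side differential privacy (DP) perturbation of model updates, and an aggregation-side detection step that filters anomalous parameters to resist poisoning. As a minimal sketch only (the function names, clipping rule, and median-distance filter below are illustrative assumptions, not the paper's actual algorithm), a client could clip and Gaussian-perturb its update, and the aggregator could discard updates far from the coordinate-wise median before averaging:

```python
import numpy as np

def dp_perturb(update, clip=1.0, sigma=0.5, rng=None):
    """Client side (sketch): clip the update to norm <= clip,
    then add Gaussian noise scaled to the clipping bound (DP-style)."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip / max(norm, 1e-12))
    return clipped + rng.normal(0.0, sigma * clip, size=update.shape)

def filter_and_aggregate(updates, tol=2.0):
    """Aggregation side (sketch): drop updates whose distance from the
    coordinate-wise median exceeds tol times the median distance,
    then average the survivors -- a simple anomaly filter."""
    stacked = np.stack(updates)
    median = np.median(stacked, axis=0)
    dists = np.linalg.norm(stacked - median, axis=1)
    cutoff = tol * np.median(dists) + 1e-12
    kept = stacked[dists <= cutoff]
    return kept.mean(axis=0)
```

For example, if five benign updates cluster near one point and one poisoned update lies far away, the poisoned update's distance to the median dwarfs the cutoff and it is excluded from the average; the paper's actual detection algorithm and DP calibration may differ.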
Pages: 3305-3317
Page count: 13