Edge-DPSDG: An Edge-Based Differential Privacy Protection Model for Smart Healthcare

Cited by: 3
Authors
Lyu, Moli [1 ,2 ]
Ni, Zhiwei [1 ,2 ]
Chen, Qian [1 ,2 ,3 ]
Li, Fenggang [1 ,2 ]
Affiliations
[1] Hefei Univ Technol, Sch Management, Hefei 230009, Anhui, Peoples R China
[2] Minist Educ, Key Lab Proc Optimizat & Intelligent Decis Making, Hefei 230009, Peoples R China
[3] Hefei Univ Technol, Intelligent Interconnected Syst Lab Anhui Prov, Hefei 230009, Peoples R China
Keywords
Data privacy; Privacy; Medical services; Differential privacy; Resource management; Distributed databases; Information entropy; Edge computing; Smart healthcare; Medical data; Shapley value; Secure multiparty computation; Homomorphic encryption
DOI
10.1109/TBDATA.2024.3366071
CLC Classification
TP [Automation Technology, Computer Technology]
Subject Classification
0812
Abstract
The edge computing paradigm has revolutionized the healthcare sector by enabling real-time medical data processing and analysis, but it also introduces serious privacy and security risks that must be carefully addressed. Based on differential privacy, we present Edge-DPSDG (Edge-Differentially Private Synthetic Data Generator), a privacy-preserving model for smart healthcare under edge computing, together with a new privacy budget allocation mechanism. In a distributed environment, the privacy budget for local medical data is personalized by computing the Shapley value and the information entropy of each attribute in the dataset, balancing the trade-off between data privacy and utility. Extensive experiments on three public medical datasets evaluate Edge-DPSDG on two metrics. For utility, Edge-DPSDG achieves up to a 21.29% accuracy improvement over the state of the art, and our privacy budget allocation mechanism improves the accuracy of existing models by up to 6.05%. For privacy, Edge-DPSDG is shown to effectively protect the original datasets. In addition, Edge-DPSDG helps smooth the data, reducing the accuracy loss relative to the non-private model by 3.99%.
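To illustrate the per-attribute budget allocation idea described above, the sketch below splits a total privacy budget across dataset attributes in proportion to each attribute's Shannon entropy. This is a minimal, hypothetical illustration only: the paper's actual mechanism also incorporates Shapley values and a distributed edge setting, and its exact allocation formula may differ. The function names `shannon_entropy` and `allocate_budget` are our own, not from the paper.

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy (in bits) of a discrete attribute's value column."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def allocate_budget(dataset, total_epsilon):
    """Split total_epsilon across attributes in proportion to each
    attribute's information entropy (illustrative allocation rule)."""
    entropies = {a: shannon_entropy(col) for a, col in dataset.items()}
    total = sum(entropies.values()) or 1.0
    return {a: total_epsilon * h / total for a, h in entropies.items()}

# Toy medical-style dataset: attribute name -> column of values.
data = {
    "blood_type": ["A", "B", "A", "O", "AB", "O", "A", "B"],
    "smoker":     ["yes", "no", "no", "no", "yes", "no", "no", "no"],
}
budgets = allocate_budget(data, total_epsilon=1.0)
```

Under this toy rule, the higher-entropy attribute (`blood_type`) receives a larger share of the budget than the lower-entropy one (`smoker`), while the shares always sum to the total budget.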
Pages: 21-34 (14 pages)