Preserving Data Utility in Differentially Private Smart Home Data

Times Cited: 0
Authors
Stirapongsasuti, Sopicha [1]
Tiausas, Francis Jerome [1]
Nakamura, Yugo [2]
Yasumoto, Keiichi [1,3]
Affiliations
[1] Nara Inst Sci & Technol, Ikoma, Nara 6300192, Japan
[2] Kyushu Univ, Dept Informat Sci & Elect Engn, Fukuoka 8190395, Japan
[3] RIKEN, Ctr Adv Intelligence Project AIP, Tokyo 1030027, Japan
Keywords
Differential privacy; machine learning; privacy; smart home; PRESERVATION; EFFICIENT; SYSTEM; CARE
DOI
10.1109/ACCESS.2024.3390039
CLC Classification
TP [Automation technology, computer technology]
Discipline Code
0812
Abstract
The development of smart sensors and appliances can provide many services. Nevertheless, aggregating privacy-sensitive data in a single location poses significant risks, since such information can be misused by a malicious attacker. Moreover, previous studies that applied privacy mechanisms tended to decrease data utility. In this paper, we propose privacy protection mechanisms for privacy-sensitive sensor data generated in a smart home. We leverage Rényi differential privacy (RDP) to preserve privacy. However, preliminary results showed that using RDP alone still significantly decreases data utility. Thus, a novel scheme called feature merging anonymization (FMA) is proposed to preserve privacy while maintaining data utility by merging feature dataframes of the same activities from other homes. In addition, the expected trade-off is defined such that data utility should exceed the privacy preserved. To evaluate the proposed techniques, we define privacy preservation and data utility as the inverse accuracy of person identification (PI) and the accuracy of activity recognition (AR), respectively. We trained the AR and PI models for two cases, with and without FMA, using two open smart-home datasets, i.e., the HIS and Toyota datasets. With FMA, the accuracy of PI on the HIS and Toyota datasets dropped to 73.85% and 41.18%, respectively, compared with 100% without FMA, while the accuracy of AR was maintained at 94.62% and 87.3% with FMA, compared with 98.58% and 89.28% without FMA, on the HIS and Toyota datasets, respectively. Another experiment explored the feasibility of implementing FMA on a local server by partially merging frames of the original activity with frames of other activities at different merging ratios. The results show that the local server can still satisfy the expected trade-off at some ratios.
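The feature merging anonymization (FMA) step described in the abstract can be pictured with a short sketch. This is a minimal illustration only, assuming that per-home sensor features are stored as numeric dataframes with an activity label, that "merging" is modeled as averaging a frame's features with a same-activity frame sampled from another home, and that a merge_ratio parameter controls the fraction of frames merged; the function fma_merge, the column names, and the averaging rule are hypothetical and are not taken from the paper.

```python
# Illustrative sketch of FMA-style merging (assumptions noted above; not the
# authors' implementation): same-activity feature frames from other homes are
# blended into a home's frames to obscure who generated them.
import numpy as np
import pandas as pd

def fma_merge(home_df: pd.DataFrame, other_df: pd.DataFrame,
              activity_col: str = "activity", merge_ratio: float = 1.0,
              seed: int = 0) -> pd.DataFrame:
    """Blend a fraction of home_df's feature rows with same-activity rows
    sampled from other homes (other_df). Purely illustrative."""
    rng = np.random.default_rng(seed)
    out = home_df.copy()
    feat_cols = [c for c in home_df.columns if c != activity_col]
    for act, idx in home_df.groupby(activity_col).groups.items():
        pool = other_df[other_df[activity_col] == act]
        if pool.empty:
            continue  # no matching activity in other homes; keep frames as-is
        idx = list(idx)
        n_merge = int(round(merge_ratio * len(idx)))
        if n_merge == 0:
            continue
        chosen = rng.choice(idx, size=n_merge, replace=False)
        donors = pool[feat_cols].sample(n=n_merge, replace=True,
                                        random_state=seed).to_numpy()
        # Average each selected frame with a donor frame from another home.
        out.loc[chosen, feat_cols] = (
            out.loc[chosen, feat_cols].to_numpy() + donors) / 2.0
    return out

# Example: two toy homes sharing activity labels.
home_a = pd.DataFrame({"activity": ["cook", "cook", "sleep"],
                       "f1": [1.0, 2.0, 3.0], "f2": [0.1, 0.2, 0.3]})
home_b = pd.DataFrame({"activity": ["cook", "sleep"],
                       "f1": [10.0, 30.0], "f2": [1.0, 3.0]})
print(fma_merge(home_a, home_b, merge_ratio=0.5))
```

The merge_ratio knob loosely mirrors the partial-merging idea in the paper's final experiment, although there the merging is done with frames of other activities rather than other homes; the exact merging operation used by the authors is not specified in the abstract.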
Pages: 56571-56581
Page Count: 11