Differentially Private Analysis of Outliers

Cited by: 8
Authors
Okada, Rina [1]
Fukuchi, Kazuto [1]
Sakuma, Jun [1]
Affiliations
[1] Univ Tsukuba, Tsukuba, Ibaraki 3058577, Japan
Source
MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2015, PT II | 2015, Vol. 9285
Keywords
Differential privacy; Outlier detection; Smooth sensitivity; Noise
DOI
10.1007/978-3-319-23525-7_28
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
This paper presents an investigation of differentially private analysis of distance-based outliers. Outlier detection aims to identify instances that are markedly distant from the other instances, whereas the objective of differential privacy is to conceal the presence (or absence) of any particular instance. Outlier detection and privacy protection are therefore intrinsically conflicting tasks. In this paper, we present differentially private queries for counting the outliers that appear in a given subspace, instead of reporting the detected outliers themselves. Our analysis of the global sensitivity of outlier counts reveals that standard global-sensitivity-based methods can make the outputs too noisy, particularly when the dimensionality of the given subspace is high. Noting that the number of outliers is typically expected to be small relative to the size of the dataset, we introduce a mechanism based on a smooth upper bound of the local sensitivity. This study is the first attempt to ensure differential privacy for distance-based outlier analysis. Experimental results show that our method achieves better utility than global-sensitivity-based methods do.
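To make the setting concrete, the following sketch shows the kind of query the abstract describes: an (r, k) distance-based outlier count released through the Laplace mechanism. This is only an illustrative baseline, not the authors' smooth-sensitivity construction; the function names are hypothetical, and the `sensitivity` parameter is left to the caller precisely because, as the abstract notes, the global sensitivity of an outlier count can be large enough to drown the answer in noise.

```python
import math
import random

def outlier_count(data, r, k):
    """Count (r, k) distance-based outliers: points with fewer than k
    other points within distance r (a common distance-based definition)."""
    count = 0
    for i, x in enumerate(data):
        neighbours = sum(
            1 for j, y in enumerate(data)
            if i != j and math.dist(x, y) <= r
        )
        if neighbours < k:
            count += 1
    return count

def noisy_outlier_count(data, r, k, epsilon, sensitivity):
    """Release the count via the Laplace mechanism: add Laplace noise
    with scale sensitivity / epsilon.  A Laplace(0, b) sample is the
    difference of two independent Exponential(1/b) samples."""
    b = sensitivity / epsilon
    noise = random.expovariate(1.0 / b) - random.expovariate(1.0 / b)
    return outlier_count(data, r, k) + noise
```

The paper's contribution can be read as replacing the `sensitivity` argument above: instead of the worst-case global sensitivity, it uses a smooth upper bound on the local sensitivity, which is much smaller when outliers are rare, so the same epsilon buys far less noise.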
Pages: 458-473 (16 pages)
Related Papers (50 total; items [41]-[50] shown)
  • [41] Differentially Private Kalman Filtering
    Le Ny, Jerome
    Pappas, George J.
    2012 50TH ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING (ALLERTON), 2012, : 1618 - 1625
  • [42] A Survey of Differentially Private Regression for Clinical and Epidemiological Research
    Ficek, Joseph
    Wang, Wei
    Chen, Henian
    Dagne, Getachew
    Daley, Ellen
    INTERNATIONAL STATISTICAL REVIEW, 2021, 89 (01) : 132 - 147
  • [43] Differentially Private Empirical Risk Minimization with Input Perturbation
    Fukuchi, Kazuto
    Quang Khai Tran
    Sakuma, Jun
    DISCOVERY SCIENCE, DS 2017, 2017, 10558 : 82 - 90
  • [44] Differentially Private Distributed Learning
    Zhou, Yaqin
    Tang, Shaojie
    INFORMS JOURNAL ON COMPUTING, 2020, 32 (03) : 779 - 789
  • [45] Differentially Private Resource Allocation
    Chen, Joann Qiongna
    Wang, Tianhao
    Zhang, Zhikun
    Zhang, Yang
    Jha, Somesh
    Li, Zhou
    39TH ANNUAL COMPUTER SECURITY APPLICATIONS CONFERENCE, ACSAC 2023, 2023, : 772 - 786
  • [46] DPETs: A Differentially Private ExtraTrees
    Zhang, ChunMei
    Li, Yang
    Chen, ZiBin
    2017 13TH INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND SECURITY (CIS), 2017, : 302 - 306
  • [47] Differentially Private Approximate Quantiles
    Kaplan, Haim
    Schnapp, Shachar
    Stemmer, Uri
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022, : 10751 - 10761
  • [48] Differentially Private Maximum Consensus
    Wang, Xin
    He, Jianping
    Cheng, Peng
    Chen, Jiming
    IFAC PAPERSONLINE, 2017, 50 (01): : 9509 - 9514
  • [49] Proving that Programs Are Differentially Private
    McIver, Annabelle
    Morgan, Carroll
    PROGRAMMING LANGUAGES AND SYSTEMS, APLAS 2019, 2019, 11893 : 3 - 18
  • [50] Differentially Private Friends Recommendation
    Macwan, Kamalkumar
    Imine, Abdessamad
    Rusinowitch, Michael
    FOUNDATIONS AND PRACTICE OF SECURITY, FPS 2022, 2023, 13877 : 236 - 251