DPWSS: differentially private working set selection for training support vector machines

Cited by: 2
Authors
Sun, Zhenlong [1 ,2 ]
Yang, Jing [1 ]
Li, Xiaoye [2 ]
Zhang, Jianpei [1 ]
Affiliations
[1] Harbin Engn Univ, Coll Comp Sci & Technol, Harbin, Heilongjiang, Peoples R China
[2] Qiqihar Univ, Coll Comp & Control Engn, Qiqihar, Heilongjiang, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Differential privacy; Exponential mechanism; Sequential minimal optimization; Support vector machines; Working set selection; SMO ALGORITHM; CONVERGENCE;
DOI
10.7717/peerj-cs.799
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The support vector machine (SVM) is a robust machine learning method widely used for classification. However, traditional SVM training may reveal personal privacy when the training data contain sensitive information. In SVM training, working set selection is a vital step of sequential minimal optimization (SMO)-type decomposition methods. To avoid the complex sensitivity analysis, and the influence of high-dimensional data on the noise, that burden existing privacy-preserving SVM classifiers, we propose a new differentially private working set selection algorithm (DPWSS), which uses the exponential mechanism to select working sets privately. We theoretically prove that the proposed algorithm satisfies differential privacy. Extensive experiments show that DPWSS achieves classification capability almost identical to that of the original non-private SVM under different parameters. The error in the optimized objective value between the two algorithms is nearly always less than two; meanwhile, DPWSS executes more efficiently than the original non-private SVM, as measured by iteration counts on different datasets. To the best of our knowledge, DPWSS is the first working set selection algorithm based on differential privacy.
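The core primitive the abstract describes, privately choosing a working set via the exponential mechanism, can be sketched generically as follows. This is an illustrative sketch of the exponential mechanism over candidate utility scores, not the paper's actual DPWSS algorithm: the function name, the example utility values, and the sensitivity argument are assumptions for illustration only.

```python
import math
import random

def exponential_mechanism(utilities, epsilon, sensitivity, rng=random):
    """Privately select one candidate index from a list of utility scores.

    Each candidate i (e.g. a candidate working set in SMO) is drawn with
    probability proportional to exp(epsilon * u_i / (2 * sensitivity)),
    which is the standard exponential mechanism and satisfies
    epsilon-differential privacy when `sensitivity` bounds how much any
    single record can change a utility score.
    """
    # Shift by the maximum utility for numerical stability; this does not
    # change the sampling distribution.
    m = max(utilities)
    weights = [math.exp(epsilon * (u - m) / (2.0 * sensitivity))
               for u in utilities]
    # Sample an index in proportion to its weight.
    r = rng.uniform(0.0, sum(weights))
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(utilities) - 1  # guard against floating-point rounding
```

With a large epsilon the mechanism almost always returns the highest-utility candidate (little noise, little privacy); with a small epsilon the choice approaches uniform sampling (more privacy, noisier selection).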
Pages: 36