EzBoost: Fast And Secure Vertical Federated Tree Boosting Framework via EzPC

Cited by: 0
Authors
Gao, Xinwen [1 ]
Fu, Shaojing [1 ]
Liu, Lin [1 ]
Luo, Yuchuan [1 ]
Yang, Luming [1 ]
Affiliations
[1] Natl Univ Def Technol, Coll Comp, Changsha, Peoples R China
Source
2023 IEEE 22ND INTERNATIONAL CONFERENCE ON TRUST, SECURITY AND PRIVACY IN COMPUTING AND COMMUNICATIONS, TRUSTCOM, BIGDATASE, CSE, EUC, ISCI 2023 | 2024
Keywords
Vertical federated learning; Privacy-preserving; Tree boosting; Efficiency; Dual-servers;
DOI
10.1109/TrustCom60117.2023.00027
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Federated learning (FL) has emerged as a prominent methodology for collaboratively training machine learning models among multiple participants while alleviating data privacy leakage through data localization. However, recent studies have shown that the transferred intermediate parameters still contain sensitive information that needs further protection. Moreover, real-world institutions often possess diverse data attributes, necessitating the adoption of Vertical Federated Learning (VFL) for cooperative learning tasks. Existing VFL research has proposed frameworks with privacy-preservation functionality, yet these suffer from drawbacks such as high participant overhead or low model accuracy. To address these challenges, in this paper we propose EzBoost, a fast and secure vertical federated tree boosting framework built upon XGBoost. Specifically, we leverage the efficient Secure Multi-party Computation (MPC) framework EzPC to facilitate the design and implementation of EzBoost. By carefully designing our framework around two non-collusive servers for secure two-party computation, EzBoost accelerates model training and querying by up to 20x and reduces participant overhead by up to 300x. In addition, we identify a potential privacy-leakage problem in recent work and propose a more robust solution to it. Through comprehensive security analysis and comparative experiments with existing approaches, we demonstrate that EzBoost achieves stronger privacy preservation, higher accuracy, and higher efficiency simultaneously.
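As context for the dual-server design mentioned in the abstract, the sketch below illustrates the general idea of additively secret-sharing gradient histogram values between two non-colluding servers, so that neither server alone learns a participant's gradients. This is a minimal, generic illustration under assumed names (share_additively, reconstruct, MODULUS, SCALE are all hypothetical); it is not EzBoost's actual protocol and does not use EzPC's API.

import secrets

MODULUS = 2 ** 64          # arithmetic shares over a 64-bit ring (a common MPC choice)
SCALE = 2 ** 16            # fixed-point scaling factor for real-valued gradients

def share_additively(value: int) -> tuple[int, int]:
    """Split an integer into two additive shares; either share alone reveals nothing."""
    r = secrets.randbelow(MODULUS)
    return r, (value - r) % MODULUS

def reconstruct(s0: int, s1: int) -> int:
    """Recombine the two shares to recover the secret modulo the ring size."""
    return (s0 + s1) % MODULUS

# A participant encodes its per-bucket gradient sums as fixed-point integers
# and hands one share vector to each of the two non-colluding servers.
grad_histogram = [0.53, -1.20, 0.07]   # example plaintext bucket sums
encoded = [int(round(g * SCALE)) % MODULUS for g in grad_histogram]
shares_s0, shares_s1 = zip(*(share_additively(v) for v in encoded))

# Each server could aggregate shares from many participants locally without
# seeing any gradient in the clear; here we simply reopen the shares to show
# that reconstruction recovers the original histogram.
opened = [reconstruct(a, b) for a, b in zip(shares_s0, shares_s1)]
decoded = [((v + MODULUS // 2) % MODULUS - MODULUS // 2) / SCALE for v in opened]
print(decoded)   # approximately [0.53, -1.2, 0.07]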
Pages: 28-37
Page count: 10