Secure and efficient parameters aggregation protocol for federated incremental learning and its applications

Cited by: 10
Authors
Wang, Xiaoying [1 ]
Liang, Zhiwei [2 ]
Koe, Arthur Sandor Voundi [2 ]
Wu, Qingwu [1 ]
Zhang, Xiaodong [1 ]
Li, Haitao [1 ]
Yang, Qintai [1 ]
Affiliations
[1] Sun Yat Sen Univ, Affiliated Hosp 3, Guangzhou 510630, Peoples R China
[2] Guangzhou Univ, Inst Artificial Intelligence & Blockchain, Guangzhou, Peoples R China
Keywords
edge computing; federated learning; machine learning; medical data; privacy protection;
DOI
10.1002/int.22727
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Code
081104; 0812; 0835; 1405;
Abstract
Federated Learning (FL) enables deploying distributed machine learning models across the cloud and Edge Devices (EDs) while preserving the privacy of sensitive local data, such as electronic health records. Despite FL's advantages in security and flexibility, current constructions still suffer from several limitations: heavy computation overhead on resource-constrained EDs, communication overhead in uploading converged local model parameters to a centralized server for aggregation, and no guarantee that previously acquired knowledge is preserved during incremental learning over new local data sets. This paper introduces a secure and resource-friendly protocol for parameter aggregation in federated incremental learning, together with its applications. In this construction, the central server relies on a new aggregation method called orthogonal gradient aggregation, which assumes that each local data set changes constantly and updates parameters in the direction orthogonal to previous parameter spaces. As a result, the construction is robust against catastrophic forgetting, maintains the accuracy of the federated neural network, and is efficient in both computation and communication overhead. Extensive experiments on several significant incremental-learning data sets demonstrate the protocol's efficiency, efficacy, and flexibility.
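The abstract's core idea, updating parameters only in directions orthogonal to the subspace used by earlier tasks, can be illustrated with a minimal sketch. The paper's actual aggregation protocol is not given here; the function name `orthogonal_update` and the assumption that earlier directions are stored as orthonormal rows of a basis matrix are illustrative only.

```python
import numpy as np

def orthogonal_update(grad, prev_basis):
    """Project a new gradient onto the orthogonal complement of the
    subspace spanned by previous directions (orthonormal rows of
    prev_basis), so the update does not overwrite earlier knowledge."""
    g = grad.astype(float).copy()
    for b in prev_basis:
        # Remove the component of g along each stored basis direction.
        g -= np.dot(g, b) * b
    return g

# Example: a previous task occupied the x-axis direction, so the
# x-component of the new gradient is removed before applying it.
basis = np.array([[1.0, 0.0]])
g = np.array([3.0, 4.0])
print(orthogonal_update(g, basis))  # -> [0. 4.]
```

In a federated setting, a server following this idea would apply such a projection when aggregating client updates, so that the merged gradient step stays orthogonal to parameter subspaces important for earlier data sets, which is what mitigates catastrophic forgetting.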
Pages: 4471-4487
Page count: 17