FedUTN: federated self-supervised learning with updating target network

Cited: 0
Authors
Simou Li
Yuxing Mao
Jian Li
Yihang Xu
Jinsen Li
Xueshuo Chen
Siyang Liu
Xianping Zhao
Affiliations
[1] State Key Laboratory of Power Transmission Equipment & System Security and New Technology, Chongqing University
[2] Electric Power Research Institute, Yunnan Power Grid Co., Ltd.
Source
Applied Intelligence | 2023 / Vol. 53
Keywords
Computer vision; Self-supervised learning; Federated learning; Federated self-supervised learning
DOI: Not available
Abstract
Self-supervised learning (SSL) can learn noteworthy representations from unlabeled data, which has mitigated the problem of insufficient labeled data to a certain extent. Original SSL methods assume centralized data, but growing awareness of privacy protection restricts the sharing of the decentralized, unlabeled data generated by mobile devices such as cameras, phones, and other terminals. Federated self-supervised learning (FedSSL) has emerged from recent efforts to combine federated learning, which has traditionally been applied to supervised learning, with SSL. Informed by past work, we propose a new FedSSL framework, FedUTN, which aims to let each client train a model that performs well on both independent and identically distributed (IID) and non-independent and identically distributed (non-IID) data. Each party possesses two asymmetric networks: a target network and an online network. FedUTN first aggregates the online-network parameters of each terminal and then updates each terminal's target network with the aggregated parameters, a radical departure from the update techniques used in earlier studies. In conjunction with this method, we offer a novel control algorithm that replaces the exponential moving average (EMA) during training. After extensive trials, we demonstrate that: (1) updating the target network with the aggregated online network is feasible; (2) FedUTN's aggregation strategy is simpler, more effective, and more robust; (3) FedUTN outperforms all other prevalent FedSSL algorithms, surpassing the SOTA algorithm by 0.5%∼1.6% under regular experimental conditions.
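The communication round described in the abstract can be sketched in a few lines. The following Python fragment is a minimal illustration based only on the abstract, not the authors' implementation: it assumes unweighted FedAvg-style averaging, represents each client as a plain dictionary of parameter arrays, and all names (Params, fed_avg, fedutn_round) are hypothetical.

```python
from typing import Dict, List
import numpy as np

Params = Dict[str, np.ndarray]  # layer name -> weight array

def fed_avg(param_sets: List[Params]) -> Params:
    """Unweighted FedAvg-style averaging over client parameter dicts
    (weighting by local data size is a common alternative)."""
    n = len(param_sets)
    return {k: sum(p[k] for p in param_sets) / n for k in param_sets[0]}

def fedutn_round(clients: List[Dict[str, Params]]) -> None:
    """One communication round as described in the abstract: aggregate
    the clients' ONLINE networks, then load the aggregate into both the
    online and the target network of every client."""
    aggregated = fed_avg([c["online"] for c in clients])
    for c in clients:
        c["online"] = {k: v.copy() for k, v in aggregated.items()}
        # Key departure from BYOL-style EMA updates: the target network
        # is overwritten with the aggregated online parameters.
        c["target"] = {k: v.copy() for k, v in aggregated.items()}
```

In this reading, the aggregated online network plays the role the slowly moving EMA target plays in BYOL, which is consistent with the abstract's claim that a control algorithm replaces EMA during local training.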
Pages: 10879-10892
Number of pages: 13