ASCFL: Accurate and Speedy Semi-Supervised Clustering Federated Learning

Cited by: 5
Authors
He, Jingyi [1 ]
Gong, Biyao [1 ]
Yang, Jiadi [1 ]
Wang, Hai [1 ]
Xu, Pengfei [1 ]
Xing, Tianzhang [1 ,2 ]
Affiliations
[1] Northwest Univ, Sch Informat Sci & Technol, Xian 710100, Peoples R China
[2] Northwest Univ, Internet Things Res Ctr, Xian 710100, Peoples R China
Source
TSINGHUA SCIENCE AND TECHNOLOGY | 2023, Vol. 28, No. 5
Funding
National Natural Science Foundation of China;
Keywords
federated learning; clustered federated learning; non-Independent and Identically Distributed (non-IID) data; similarity indicator; client selection; semi-supervised learning;
DOI
10.26599/TST.2022.9010057
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
The influence of non-Independent and Identically Distributed (non-IID) data on Federated Learning (FL) has been a serious concern. Clustered Federated Learning (CFL) is an emerging approach for reducing the impact of non-IID data; it groups clients according to similarity computed from relevant metrics. Unfortunately, existing CFL methods pursue accuracy improvement alone and ignore the convergence rate. In addition, the client selection strategy they adopt affects the clustering results. Finally, traditional semi-supervised learning changes the data distribution on clients, which leads to higher local costs and unsatisfactory performance. In this paper, we propose a novel CFL method named ASCFL, which selects the clients that participate in training and can dynamically adjust the balance between accuracy and convergence speed on datasets consisting of labeled and unlabeled data. To handle unlabeled data, a label prediction strategy predicts labels with encoders. The client selection strategy improves accuracy and reduces overhead by selecting clients with higher losses to participate in the current round; see the sketch below. Moreover, the similarity-based clustering strategy uses a new indicator to measure the similarity between clients. Experimental results show that ASCFL offers advantages in model accuracy and convergence speed over three state-of-the-art methods on two popular datasets.
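Illustrative sketch: the loss-biased selection rule summarized in the abstract (favor clients with higher local losses) could be coded roughly as below. This is only an assumed sketch, not the authors' implementation; the names select_clients, client_losses, and temperature are hypothetical, and the softmax weighting is one plausible way to bias sampling toward high-loss clients.

    import numpy as np

    def select_clients(client_losses, num_selected, temperature=1.0, rng=None):
        """Sample clients for the next round, biased toward higher local losses.

        client_losses : dict mapping client id -> latest local training loss
        num_selected  : number of clients to pick for this round
        temperature   : >1 softens the loss bias, <1 sharpens it
        """
        rng = rng or np.random.default_rng()
        ids = list(client_losses.keys())
        losses = np.array([client_losses[c] for c in ids], dtype=float)

        # Softmax over losses (shifted for numerical stability):
        # clients with higher loss get a higher sampling probability.
        weights = np.exp((losses - losses.max()) / temperature)
        probs = weights / weights.sum()

        chosen = rng.choice(ids, size=min(num_selected, len(ids)),
                            replace=False, p=probs)
        return list(chosen)

    # Toy usage: the high-loss clients c4 and c2 are picked most often.
    losses = {"c1": 0.21, "c2": 1.35, "c3": 0.87, "c4": 2.10}
    print(select_clients(losses, num_selected=2))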
Pages: 823-837
Number of pages: 15
Related Papers
50 records in total
  • [41] A Knowledge Transfer-Based Semi-Supervised Federated Learning for IoT Malware Detection
    Pei, Xinjun
    Deng, Xiaoheng
    Tian, Shengwei
    Zhang, Lan
    Xue, Kaiping
    IEEE TRANSACTIONS ON DEPENDABLE AND SECURE COMPUTING, 2023, 20 (03) : 2127 - 2143
  • [42] CROSS-SILO FEDERATED TRAINING IN THE CLOUD WITH DIVERSITY SCALING AND SEMI-SUPERVISED LEARNING
    Nandury, Kishore
    Mohan, Anand
    Weber, Frederick
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 3085 - 3089
  • [43] Dual Class-Aware Contrastive Federated Semi-Supervised Learning
    Guo, Qi
    Wu, Di
    Qi, Yong
    Qi, Saiyu
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2025, 24 (02) : 1073 - 1089
  • [44] METALS : seMi-supervised fEderaTed Active Learning for intrusion detection Systems
    Aouedi, Ons
    Jajoo, Gautam
    Piamrat, Kandaraj
    2024 IEEE SYMPOSIUM ON COMPUTERS AND COMMUNICATIONS, ISCC 2024, 2024,
  • [45] Two-stage sampling with predicted distribution changes in federated semi-supervised learning
    Zhu, Suxia
    Ma, Xing
    Sun, Guanglu
    KNOWLEDGE-BASED SYSTEMS, 2024, 295
  • [46] On semi-supervised learning
    Cholaquidis, A.
    Fraiman, R.
    Sued, M.
    TEST, 2020, 29 (04) : 914 - 937
  • [47] On semi-supervised learning
    A. Cholaquidis
    R. Fraiman
    M. Sued
    TEST, 2020, 29 : 914 - 937
  • [48] Semi-supervised clustering with deep metric learning and graph embedding
    Xiaocui Li
    Hongzhi Yin
    Ke Zhou
    Xiaofang Zhou
    World Wide Web, 2020, 23 : 781 - 798
  • [49] Semi-supervised clustering with deep metric learning and graph embedding
    Li, Xiaocui
    Yin, Hongzhi
    Zhou, Ke
    Zhou, Xiaofang
WORLD WIDE WEB-INTERNET AND WEB INFORMATION SYSTEMS, 2020, 23 (02): 781 - 798
  • [50] Ensemble Knowledge Distillation for Federated Semi-Supervised Image Classification
    Shang, Ertong
    Liu, Hui
    Zhang, Jingyang
    Zhao, Runqi
    Du, Junzhao
TSINGHUA SCIENCE AND TECHNOLOGY, 2025, 30 (01): 112 - 123