Boosting semi-supervised learning with Contrastive Complementary Labeling

Cited by: 3
Authors
Deng, Qinyi [1 ]
Guo, Yong [1 ]
Yang, Zhibang [1 ]
Pan, Haolin [1 ]
Chen, Jian [1 ]
Affiliations
[1] South China Univ Technol, Guangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Semi-supervised learning; Contrastive learning; Complementary labels;
D O I
10.1016/j.neunet.2023.11.052
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Semi-supervised learning (SSL) approaches have achieved great success in leveraging large amounts of unlabeled data to train deep models. Among them, a popular approach is pseudo-labeling, which generates pseudo labels only for unlabeled data with high-confidence predictions. As for the low-confidence ones, existing methods often simply discard them because such unreliable pseudo labels may mislead the model. Unlike existing methods, we highlight that these low-confidence data can still be beneficial to the training process. Specifically, although we cannot determine which class a low-confidence sample belongs to, we can assume that it is very unlikely to belong to the classes with the lowest predicted probabilities (often called complementary classes/labels). Inspired by this, we propose a novel Contrastive Complementary Labeling (CCL) method that constructs a large number of reliable negative pairs based on the complementary labels and adopts contrastive learning to make use of all the unlabeled data. Extensive experiments demonstrate that CCL significantly improves performance on top of existing advanced methods and is particularly effective under label-scarce settings. For example, CCL yields an improvement of 2.43% over FixMatch on CIFAR-10 with only 40 labeled samples.
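The core idea in the abstract can be sketched in code: take the k lowest-probability classes of each low-confidence sample as its complementary labels, then pair it as a negative with any sample confidently pseudo-labeled as one of those classes. This is a minimal illustrative sketch, not the paper's implementation; the function names, the confidence threshold, the choice of k, and the simplified repulsion loss are all assumptions for exposition.

```python
import numpy as np


def complementary_labels(probs: np.ndarray, k: int = 3) -> np.ndarray:
    """Indices of the k lowest-probability classes per sample (assumed rule)."""
    return np.argsort(probs, axis=1)[:, :k]


def negative_pair_mask(probs: np.ndarray,
                       conf_threshold: float = 0.95,
                       k: int = 3) -> np.ndarray:
    """Build an (N, N) boolean mask of reliable negative pairs.

    Sample j is treated as a negative for sample i when j's confident
    pseudo label falls among i's complementary (least-likely) classes,
    so the pair is very unlikely to share a class.
    """
    n = probs.shape[0]
    comp = complementary_labels(probs, k)            # (N, k) complementary classes
    pseudo = probs.argmax(axis=1)                    # (N,) hard pseudo labels
    confident = probs.max(axis=1) >= conf_threshold  # (N,) high-confidence flags
    mask = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(n):
            if i != j and confident[j] and pseudo[j] in comp[i]:
                mask[i, j] = True
    return mask


def repulsion_loss(embeddings: np.ndarray,
                   mask: np.ndarray,
                   tau: float = 0.5) -> float:
    """Toy contrastive term: mean exp-similarity over negative pairs.

    Minimizing it pushes negative pairs apart in embedding space; the
    actual CCL objective is a full contrastive loss, not shown here.
    """
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / tau
    if not mask.any():
        return 0.0
    return float(np.exp(sim[mask]).mean())
```

Note that, unlike plain pseudo-labeling, even a sample whose top prediction is unreliable still contributes here: only its *lowest*-probability classes are used, which is what lets CCL exploit all unlabeled data.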
Pages: 417-426
Page count: 10