Addressing the Overfitting in Partial Domain Adaptation With Self-Training and Contrastive Learning

Cited by: 9
Authors
He, Chunmei [1 ,2 ]
Li, Xiuguang [1 ,2 ]
Xia, Yue [1 ,2 ]
Tang, Jing [1 ,2 ]
Yang, Jie [1 ,2 ]
Ye, Zhengchun [3 ]
Affiliations
[1] Xiangtan Univ, Sch Comp Sci, Xiangtan 411105, Hunan, Peoples R China
[2] Xiangtan Univ, Sch Cyberspace Sci, Xiangtan 411105, Hunan, Peoples R China
[3] Xiangtan Univ, Sch Mech Engn, Xiangtan 411105, Hunan, Peoples R China
Keywords
Entropy; Feature extraction; Reliability; Adaptation models; Training; Cyberspace; Computer science; Transfer learning; partial domain adaptation; deep neural network; image classification; contrastive learning
DOI
10.1109/TCSVT.2023.3296617
Chinese Library Classification (CLC)
TM [Electrical engineering and technology]; TN [Electronics and communication technology]
Subject classification codes
0808; 0809
Abstract
Partial domain adaptation (PDA) assumes that the target-domain class label set is a subset of the source-domain label set, a setting that is close to practical scenarios. At present, two main methods address overfitting to the source domain in PDA: entropy minimization and weighted self-training. However, for samples whose prediction distribution is relatively flat, entropy minimization may sharpen the prediction without making it accurate, causing the model to learn more erroneous information. Weighted self-training, in turn, introduces erroneous noise into the self-training process because the weights themselves are noisy. Therefore, we address these issues and propose the self-training contrastive partial domain adaptation (STCPDA) method. STCPDA mines domain information through two modules. We first design a self-training module based on simple samples in the target domain to address overfitting to the source domain: target-domain samples are divided into simple samples with high reliability and difficult samples with low reliability, and the pseudo-labels of the simple samples are selected for self-training. We then construct a contrastive learning module for the source and target domains, embedding contrastive learning into the feature space of the two domains. This module fully explores the hidden information in all domain samples and makes the class boundaries more salient. Extensive experiments on five datasets show the effectiveness and excellent classification performance of our method.
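To give a concrete feel for the two modules described in the abstract, the PyTorch sketch below shows one plausible way to (i) split target samples into "simple" and "difficult" groups by prediction entropy and self-train on the pseudo-labels of the simple ones, and (ii) apply a generic supervised-contrastive (InfoNCE-style) loss over domain features. The entropy threshold, temperature, and the exact contrastive formulation are assumptions made for illustration and are not taken from the STCPDA paper.

import math
import torch
import torch.nn.functional as F

def split_by_entropy(target_logits, entropy_threshold=0.5):
    # Normalised prediction entropy in [0, 1]; low entropy = "simple"/reliable sample.
    # The threshold value is an assumed hyperparameter, not from the paper.
    probs = F.softmax(target_logits, dim=1)
    entropy = -(probs * torch.log(probs + 1e-8)).sum(dim=1)
    entropy = entropy / math.log(probs.size(1))
    simple_mask = entropy < entropy_threshold
    pseudo_labels = probs.argmax(dim=1)
    return simple_mask, pseudo_labels

def self_training_loss(target_logits, simple_mask, pseudo_labels):
    # Cross-entropy on the pseudo-labels of the reliable ("simple") target samples only.
    if simple_mask.sum() == 0:
        return target_logits.new_zeros(())
    return F.cross_entropy(target_logits[simple_mask], pseudo_labels[simple_mask])

def contrastive_loss(features, labels, temperature=0.1):
    # Generic supervised-contrastive loss over L2-normalised features of source
    # samples and pseudo-labelled target samples; same-class pairs are positives.
    features = F.normalize(features, dim=1)
    sim = features @ features.t() / temperature
    n = features.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=features.device)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye
    exp_sim = torch.exp(sim).masked_fill(eye, 0.0)
    log_prob = sim - torch.log(exp_sim.sum(dim=1, keepdim=True) + 1e-8)
    per_anchor = -(log_prob * pos_mask).sum(dim=1) / pos_mask.sum(dim=1).clamp(min=1)
    valid = pos_mask.any(dim=1)
    if valid.sum() == 0:
        return features.new_zeros(())
    return per_anchor[valid].mean()

In the full method these terms would presumably be combined with the supervised source-domain classification loss under suitable trade-off weights; the paper should be consulted for the exact objective and weighting.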
Pages: 1532-1545
Number of pages: 14
Related papers
50 records in total
  • [1] Self-Training with Contrastive Learning for Adversarial Domain Adaptation
Zhang, Xingyi
    Institute of Electrical and Electronics Engineers Inc.
  • [2] Contrastive Learning and Self-Training for Unsupervised Domain Adaptation in Semantic Segmentation
    Marsden, Robert A.
    Bartler, Alexander
    Doebler, Mario
    Yang, Bin
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [3] Cycle Self-Training for Domain Adaptation
    Liu, Hong
    Wang, Jianmin
    Long, Mingsheng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [4] DaMSTF: Domain Adversarial Learning Enhanced Meta Self-Training for Domain Adaptation
    Lu, Menglong
    Huang, Zhen
    Zhao, Yunxiang
    Tian, Zhiliang
    Liu, Yang
    Li, Dongsheng
    PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 1, 2023, : 1650 - 1668
  • [5] Understanding Self-Training for Gradual Domain Adaptation
    Kumar, Ananya
    Ma, Tengyu
    Liang, Percy
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [6] Unsupervised Domain Adaptation for Medical Image Segmentation by Disentanglement Learning and Self-Training
    Xie, Qingsong
    Li, Yuexiang
    He, Nanjun
    Ning, Munan
    Ma, Kai
    Wang, Guoxing
    Lian, Yong
    Zheng, Yefeng
    IEEE TRANSACTIONS ON MEDICAL IMAGING, 2024, 43 (01) : 4 - 14
  • [7] CSTN: A cross-region crop mapping method integrating self-training and contrastive domain adaptation
    Peng, Shuwen
    Zhang, Liqiang
    Xie, Rongchang
    Qu, Ying
    INTERNATIONAL JOURNAL OF APPLIED EARTH OBSERVATION AND GEOINFORMATION, 2025, 136
  • [8] Unsupervised domain adaptation with self-training for weed segmentation
    Huang, Yingchao
    Hussein, Amina E.
    Wang, Xin
    Bais, Abdul
    Yao, Shanshan
    Wilder, Tanis
    INTELLIGENT SYSTEMS WITH APPLICATIONS, 2025, 25
  • [9] Adversarial Domain Adaptation Enhanced via Self-training
    Altinel, Fazil
    Akkaya, Ibrahim Batuhan
    29TH IEEE CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS (SIU 2021), 2021,
  • [10] A Unified Contrastive Loss for Self-training
    Gauffre, Aurelien
    Horvat, Julien
    Amini, Massih-Reza
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES-RESEARCH TRACK AND DEMO TRACK, PT VIII, ECML PKDD 2024, 2024, 14948 : 3 - 18