Prototype-Augmented Contrastive Learning for Few-Shot Unsupervised Domain Adaptation

Times Cited: 0
Authors
Gong, Lu [1 ]
Zhang, Wen [1 ]
Li, Mingkang [1 ]
Zhang, Jiali [1 ]
Zhang, Zili [1 ]
Affiliations
[1] Southwest Univ, Coll Comp & Informat Sci, Chongqing 400715, Peoples R China
Source
KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT IV, KSEM 2023 | 2023, Vol. 14120
Keywords
Unsupervised domain adaptation; Self-supervised learning; Few-shot learning; Prototype learning; Contrastive learning;
DOI
10.1007/978-3-031-40292-0_17
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Unsupervised domain adaptation aims to learn a classification model on a richly labeled source domain and apply it to a fully unlabeled target domain. However, collecting enough labeled source samples is difficult in some scenarios, which substantially degrades the effectiveness of previous approaches. This work therefore considers a more challenging and practical problem, few-shot unsupervised domain adaptation, in which a classifier trained with only a few source labels must generalize well to the target domain. Prototype-based self-supervised learning has delivered strong performance on this problem, but the quality of the prototypes can still be improved. To this end, a novel Prototype-Augmented Contrastive Learning method is proposed. A new computation strategy rectifies the source prototypes, which are then used to improve the target prototypes. To better capture semantic information and align features, both in-domain and cross-domain prototype contrastive learning are performed. Extensive experiments on three widely used benchmarks, Office, OfficeHome, and DomainNet, show accuracy improvements of over 3%, 1%, and 0.5%, respectively, demonstrating the effectiveness of the proposed method.
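The abstract's core ingredients, class prototypes built from a few labeled features and an InfoNCE-style contrastive loss that pulls a feature toward its class prototype, can be sketched as follows. This is a minimal illustration with assumed details (mean-of-features prototypes, cosine similarity, temperature `tau=0.1`); the paper's actual prototype rectification and cross-domain alignment strategies are not reproduced here.

```python
import numpy as np

def class_prototypes(features, labels, num_classes):
    # One prototype per class: the mean of that class's feature vectors,
    # L2-normalized. (A common definition; the paper additionally rectifies
    # source prototypes with a new computation strategy.)
    protos = np.stack([features[labels == c].mean(axis=0)
                       for c in range(num_classes)])
    return protos / np.linalg.norm(protos, axis=1, keepdims=True)

def prototype_contrastive_loss(feature, label, prototypes, tau=0.1):
    # InfoNCE-style loss over prototypes: treat the feature's own class
    # prototype as the positive and all other prototypes as negatives.
    f = feature / np.linalg.norm(feature)
    logits = prototypes @ f / tau                      # cosine similarities / temperature
    log_probs = logits - np.log(np.exp(logits).sum())  # log-softmax over classes
    return -log_probs[label]                           # cross-entropy for the true class
```

In-domain prototype contrastive learning would apply this loss with prototypes from the feature's own domain; the cross-domain variant would contrast features of one domain against prototypes of the other, encouraging feature alignment across domains.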
Pages: 197-210
Page count: 14