Memory-Enhanced Confidence Calibration for Class-Incremental Unsupervised Domain Adaptation

Cited: 0
Authors
Yu, Jiaping [1 ]
Yang, Muli [2 ]
Wu, Aming [1 ]
Deng, Cheng [1 ]
Affiliations
[1] Xidian Univ, Sch Elect Engn, Xian 710071, Peoples R China
[2] ASTAR, Inst Infocomm Res I2R, Singapore 138632, Singapore
Funding
National Natural Science Foundation of China; National Key Research and Development Program of China;
Keywords
Data models; Training; Adaptation models; Calibration; Feature extraction; Character recognition; Accuracy; Tail; Predictive models; Incremental learning; Unsupervised domain adaptation; class incremental learning; image recognition; causality;
DOI
10.1109/TMM.2024.3521834
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
In this paper, we focus on Class-Incremental Unsupervised Domain Adaptation (CI-UDA), where the labeled source domain already includes all classes, while the classes in the unlabeled target domain emerge sequentially over time. This task involves two main challenges. The first is the domain gap between the labeled source data and the unlabeled target data, which leads to weak generalization performance. The second is the inconsistency between the source and target category spaces at each time step, which causes catastrophic forgetting during the testing stage. Previous methods focus solely on aligning similar samples from different domains, overlooking the underlying causes of the domain gap and the class-distribution difference. To tackle these issues, we rethink this task from a causal perspective for the first time. We first build a structural causal graph to describe the CI-UDA problem. Based on this causal graph, we present Memory-Enhanced Confidence Calibration (MECC), which aims to improve the confidence of the predicted results. In particular, we argue that the domain discrepancy caused by differing styles is prone to make the model produce less confident predictions, thereby weakening its generalization and continual learning abilities. To this end, we first explore using the Gram matrix to generate source-style target data, which is combined with the original data to jointly train the model and thereby reduce the impact of domain shift. Second, we utilize the model from the previous time step to select corresponding samples for building a memory bank, which is instrumental in alleviating catastrophic forgetting. Extensive experimental results on multiple datasets demonstrate the superiority of our method.
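The abstract does not give formulas, but the Gram-matrix style statistic it refers to is conventionally computed as channel-wise feature correlations, as in Gatys-style transfer. A minimal NumPy sketch under that assumption (function names are illustrative, not from the paper):

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a feature map: channel-wise correlations that
    capture style (texture statistics) while discarding spatial layout.
    features: array of shape (C, H, W)."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)   # flatten spatial dimensions
    return f @ f.T / (c * h * w)     # normalized C x C Gram matrix

def style_loss(source_feats, target_feats):
    """Squared Frobenius distance between Gram matrices; minimizing it
    pushes the target features toward the source style."""
    g_s = gram_matrix(source_feats)
    g_t = gram_matrix(target_feats)
    return float(np.sum((g_s - g_t) ** 2))
```

Minimizing such a style loss while synthesizing images is one standard way to realize the "source-style target data" the abstract describes; the paper's exact formulation may differ.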
Pages: 610 - 621
Page count: 12
Related Papers
50 items in total
  • [31] Unsupervised domain adaptation via progressive positioning of target-class prototypes
    Du, Yongjie
    Zhou, Ying
    Xie, Yu
    Zhou, Deyun
    Shi, Jiao
    Lei, Yu
    KNOWLEDGE-BASED SYSTEMS, 2023, 273
  • [33] Unsupervised domain adaptation of dynamic extension networks based on class decision boundaries
    Chen, Yuanjiao
    Wang, Diao
    Zhu, Darong
    Xu, Zhe
    He, Bishi
    MULTIMEDIA SYSTEMS, 2024, 30 (02)
  • [34] Per-class curriculum for Unsupervised Domain Adaptation in semantic segmentation
    Roberto Alcover-Couso
    Juan C. SanMiguel
    Marcos Escudero-Viñolo
    Pablo Carballeira
    The Visual Computer, 2025, 41 (2) : 901 - 919
  • [35] Prototype and Context-Enhanced Learning for Unsupervised Domain Adaptation Semantic Segmentation of Remote Sensing Images
    Gao, Kuiliang
    Yu, Anzhu
    You, Xiong
    Qiu, Chunping
    Liu, Bing
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2023, 61
  • [36] Dual-Domain Feature Fusion and Multi-Level Memory-Enhanced Network for Spectral Compressive Imaging
    Ying, Yangke
    Wang, Jin
    Shi, Yunhui
    Ling, Nam
    Yin, Baocai
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (10) : 9562 - 9577
  • [37] Unsupervised Domain Adaptation via Contrastive Learning and Complementary Region-Class Mixing
    Li, Xiaojing
    Zhou, Wei
    Jiang, Mingjian
    IEEE ACCESS, 2024, 12 : 193284 - 193298
  • [38] Class-Aware Distribution Alignment based Unsupervised Domain Adaptation for Speaker Verification
    Hu, Hang-Rui
    Song, Yan
    Dai, Li-Rong
    McLoughlin, Ian
    Liu, Lin
    INTERSPEECH 2022, 2022, : 3689 - 3693
  • [39] Attention-Guided Optimal Transport for Unsupervised Domain Adaptation with Class Structure Prior
    Li, Ying
    Zhu, Yanan
    Ying, Shihui
    NEURAL PROCESSING LETTERS, 2023, 55 (09) : 12547 - 12567