Unsupervised Domain Adaptation Enhanced by Fuzzy Prompt Learning

Cited by: 3
Authors
Shi, Kuo [1 ]
Lu, Jie [1 ]
Fang, Zhen [1 ]
Zhang, Guangquan [1 ]
Affiliations
[1] Univ Technol Sydney, Australian Artificial Intelligence Inst, Fac Engn & Informat Technol, Sydney, NSW 2007, Australia
Funding
Australian Research Council;
Keywords
Task analysis; Adaptation models; Vectors; Training data; Fuzzy systems; Probability distribution; Feature extraction; Domain adaptation; fuzzy clustering; prompt learning; transfer learning;
DOI
10.1109/TFUZZ.2024.3389705
CLC Classification
TP18 [Artificial intelligence theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Unsupervised domain adaptation (UDA) addresses the challenge of distribution shift between a labeled source domain and an unlabeled target domain by utilizing knowledge from the source. Traditional UDA methods mainly focus on single-modal scenarios, either vision or language, and thus do not fully exploit the advantages of multimodal representations. Vision-language models utilize multimodal information, applying prompt learning techniques to address target domain tasks. Motivated by recent advancements in pretrained vision-language models, this article expands the UDA framework to incorporate multimodal approaches using fuzzy techniques. The adoption of fuzzy techniques, preferred over conventional domain adaptation methods, rests on two key aspects: 1) the nature of prompt learning is intrinsically linked to fuzzy logic, and 2) fuzzy techniques are superior at processing soft information and effectively exploiting inherent relationships both within and across domains. To this end, we propose UDA enhanced by fuzzy prompt learning (FUZZLE), a simple and effective method for aligning the source and target domains via domain-specific prompt learning. Specifically, we introduce a novel technique to enhance prompt learning in the target domain: it integrates fuzzy C-means clustering and a novel instance-level fuzzy vector into the prompt learning loss function, minimizing the distance between prompt cluster centers and instance prompts and thereby enhancing the prompt learning process. In addition, we propose a Kullback-Leibler (KL) divergence-based loss function with a fuzzification factor, designed to minimize the distribution discrepancy in the classification of similar cross-domain data and to align domain-specific prompts during training. We contribute an in-depth analysis to understand the effectiveness of FUZZLE.
Extensive experiments demonstrate that our method achieves superior performance on standard UDA benchmarks.
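The fuzzy C-means component described in the abstract (soft clustering of prompt representations, with a loss pulling instance prompts toward cluster centers) can be illustrated with a minimal NumPy sketch. `fuzzy_c_means` is a textbook implementation; `prompt_cluster_loss` is a hypothetical stand-in for the paper's clustering loss term, whose exact form is not given in this record.

```python
import numpy as np

def fuzzy_c_means(X, n_clusters, m=2.0, n_iter=50, seed=0):
    """Standard fuzzy C-means: returns soft memberships u[i, k] in [0, 1]
    (rows sum to 1) and cluster centers. The fuzzifier m > 1 controls how
    soft the assignments are (m -> 1 approaches hard k-means)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    u = rng.random((n, n_clusters))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        um = u ** m
        # Centers: membership-weighted means of the data points.
        centers = (um.T @ X) / um.sum(axis=0)[:, None]
        # Distances from each instance to each center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # Membership update from the FCM objective's stationarity condition.
        inv = d ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True)
    return u, centers

def prompt_cluster_loss(prompts, centers, u, m=2.0):
    """Hypothetical clustering loss: membership-weighted squared distance
    between instance prompts and fuzzy cluster centers, averaged over
    instances (an assumption standing in for the paper's term)."""
    d2 = ((prompts[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return float(((u ** m) * d2).sum() / prompts.shape[0])
```

In practice the inputs here would be learned prompt embeddings rather than raw features, and the loss would be minimized jointly with the prompt learning objective.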
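The KL divergence-based alignment loss with a fuzzification factor can likewise be sketched. The weighting scheme below (`similarity ** gamma`, emphasizing similar cross-domain pairs) is an illustrative assumption, not the paper's formula.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for two discrete class distributions, clipped for stability."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float((p * np.log(p / q)).sum())

def fuzzified_kl_loss(src_probs, tgt_probs, similarity, gamma=2.0):
    """Hypothetical fuzzified KL loss: pairwise KL between source and target
    predicted class distributions, weighted by a fuzzification of a given
    cross-domain similarity matrix (similarity ** gamma), then normalized."""
    w = similarity ** gamma
    losses = np.array([[kl_divergence(p, q) for q in tgt_probs]
                       for p in src_probs])
    return float((w * losses).sum() / max(w.sum(), 1e-12))
```

The intent matches the abstract: similar cross-domain pairs receive larger weights, so minimizing the loss drives their predicted distributions (and hence the domain-specific prompts producing them) closer together.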
Pages: 4038-4048
Page count: 11
Related Papers
(50 in total)
  • [1] Domain Adaptation via Prompt Learning
    Ge, Chunjiang
    Huang, Rui
    Xie, Mixue
    Lai, Zihang
    Song, Shiji
    Li, Shuang
    Huang, Gao
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (01) : 1160 - 1170
  • [2] Adversarial Learning and Interpolation Consistency for Unsupervised Domain Adaptation
    Zhao, Xin
    Wang, Shengsheng
    IEEE ACCESS, 2019, 7 : 170448 - 170456
  • [3] Guide Subspace Learning for Unsupervised Domain Adaptation
    Zhang, Lei
    Fu, Jingru
    Wang, Shanshan
    Zhang, David
    Dong, Zhaoyang
    Chen, C. L. Philip
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (09) : 3374 - 3388
  • [4] Low-Rank Correlation Learning for Unsupervised Domain Adaptation
    Lu, Yuwu
    Wong, Wai Keung
    Yuan, Chun
    Lai, Zhihui
    Li, Xuelong
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 4153 - 4167
  • [5] Iterative Soft Prompt-Tuning for Unsupervised Domain Adaptation
    Zhu, Yi
    Wang, Shuqin
    Qiang, Jipeng
    Wu, Xindong
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (12) : 8580 - 8592
  • [6] Domain Prompt Tuning via Meta Relabeling for Unsupervised Adversarial Adaptation
    Jin, Xin
    Lan, Cuiling
    Zeng, Wenjun
    Chen, Zhibo
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 8333 - 8347
  • [7] When Adversarial Training Meets Prompt Tuning: Adversarial Dual Prompt Tuning for Unsupervised Domain Adaptation
    Cui, Chaoran
    Liu, Ziyi
    Gong, Shuai
    Zhu, Lei
    Zhang, Chunyun
    Liu, Hui
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2025, 34 : 1427 - 1440
  • [8] Transferable Feature Selection for Unsupervised Domain Adaptation
    Yan, Yuguang
    Wu, Hanrui
    Ye, Yuzhong
    Bi, Chaoyang
    Lu, Min
    Liu, Dapeng
    Wu, Qingyao
    Ng, Michael K.
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2022, 34 (11) : 5536 - 5551
  • [9] Semantic-Aware Adaptive Prompt Learning for Universal Multi-Source Domain Adaptation
    Yang, Yuxiang
    Hou, Yun
    Wen, Lu
    Zeng, Pinxian
    Wang, Yan
    IEEE SIGNAL PROCESSING LETTERS, 2024, 31 : 1444 - 1448
  • [10] Informative Feature Disentanglement for Unsupervised Domain Adaptation
    Deng, Wanxia
    Zhao, Lingjun
    Liao, Qing
    Guo, Deke
    Kuang, Gangyao
    Hu, Dewen
    Pietikainen, Matti
    Liu, Li
    IEEE TRANSACTIONS ON MULTIMEDIA, 2022, 24 : 2407 - 2421