On the Importance of Attention and Augmentations for Hypothesis Transfer in Domain Adaptation and Generalization

Cited: 3
Authors
Thomas, Georgi [1 ]
Sahay, Rajat [1 ]
Jahan, Chowdhury Sadman [1 ]
Manjrekar, Mihir [1 ]
Popp, Dan [1 ]
Savakis, Andreas [1 ]
Affiliations
[1] Rochester Inst Technol, Rochester, NY 14623 USA
Keywords
domain adaptation; domain generalization; vision transformers; convolutional neural networks
DOI
10.3390/s23208409
Chinese Library Classification
O65 [Analytical Chemistry]
Discipline Codes
070302; 081704
Abstract
Unsupervised domain adaptation (UDA) aims to mitigate the performance drop caused by the distribution shift between training and testing datasets. UDA methods have achieved performance gains by transferring models trained on a source domain with labeled data to a target domain with only unlabeled data. Convolutional neural networks (CNNs) have been the standard feature extractors in domain adaptation. Recently, attention-based transformer models have emerged as effective alternatives for computer vision tasks. In this paper, we benchmark three attention-based architectures, namely the vision transformer (ViT), the shifted window transformer (SWIN), and the dual attention vision transformer (DAViT), against the convolutional architectures ResNet and HRNet and the transformer-inspired ConvNext, to assess the performance of different backbones for domain generalization and adaptation. We incorporate these backbone architectures as feature extractors in the source hypothesis transfer (SHOT) framework for UDA. SHOT leverages the knowledge learned in the source domain to align the image features of unlabeled target data in the absence of source domain data, using self-supervised deep feature clustering and self-training. We analyze the generalization and adaptation performance of these models on standard UDA datasets and aerial UDA datasets. In addition, we modernize the training procedure commonly used in UDA tasks by adding image augmentation techniques that help the models generate richer features. Our results show that ConvNext and SWIN offer the best performance, indicating that the attention mechanism is very beneficial for domain generalization and adaptation with both transformer and convolutional architectures. Our ablation study shows that our modernized training recipe, within the SHOT framework, significantly boosts performance on aerial datasets.
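The self-supervised deep feature clustering step mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a SHOT-style procedure in which target features and the source classifier's softmax outputs are used to estimate per-class centroids, and pseudo-labels are then assigned by nearest-centroid matching under cosine similarity, with no access to source data. All function and variable names here are illustrative.

```python
import numpy as np

def shot_pseudo_labels(features, probs, n_iters=2):
    """Sketch of SHOT-style pseudo-labeling via weighted feature clustering.

    features: (N, D) target-domain features from the frozen source backbone.
    probs:    (N, C) softmax outputs of the source classifier (the hypothesis).
    Returns:  (N,) pseudo-labels from nearest-centroid assignment.
    """
    # L2-normalize features so nearest-centroid matching uses cosine similarity.
    feats = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-8)

    # Initial class centroids: prediction-weighted mean of target features.
    centroids = probs.T @ feats                                   # (C, D)
    centroids /= np.linalg.norm(centroids, axis=1, keepdims=True) + 1e-8

    labels = np.argmax(feats @ centroids.T, axis=1)
    for _ in range(n_iters):
        # Re-estimate centroids from the current hard assignments,
        # then reassign each sample to its nearest centroid.
        onehot = np.eye(probs.shape[1])[labels]                   # (N, C)
        centroids = onehot.T @ feats
        centroids /= np.linalg.norm(centroids, axis=1, keepdims=True) + 1e-8
        labels = np.argmax(feats @ centroids.T, axis=1)
    return labels
```

In the full SHOT framework these pseudo-labels would then supervise self-training of the target model; this sketch only shows the source-free clustering idea, which is why the backbone choice (ViT, SWIN, DAViT, ConvNext, ResNet, HRNet) matters: richer features produce cleaner clusters.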
Pages: 22