TL-ADA: Transferable Loss-based Active Domain Adaptation

Citations: 10
Authors
Han, Kyeongtak [1 ]
Kim, Youngeun [2 ]
Han, Dongyoon [3 ]
Lee, Hojun [1 ]
Hong, Sungeun [1 ]
Affiliations
[1] Inha Univ, Dept Elect & Comp Engn, Incheon, South Korea
[2] Yale Univ, Dept Elect Engn, New Haven, CT USA
[3] NAVER AI Lab, Sungnam, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Active Domain Adaptation; Loss prediction; Pseudo labels; Transferable query selection; Ranking loss;
DOI
10.1016/j.neunet.2023.02.004
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The field of Active Domain Adaptation (ADA) has been investigating ways to close the performance gap between supervised and unsupervised learning settings. Previous ADA research has primarily focused on query selection, but there has been little examination of how to effectively train newly labeled target samples using both labeled source samples and unlabeled target samples. In this study, we present a novel Transferable Loss-based ADA (TL-ADA) framework. Our approach is inspired by loss-based query selection, which has shown promising results in active learning. However, directly applying loss-based query selection to the ADA scenario leads to a buildup of high-loss samples that do not contribute to the model due to transferability issues and low diversity. To address these challenges, we propose a transferable doubly nested loss, which incorporates target pseudo labels and a domain adversarial loss. Our TL-ADA framework trains the model sequentially, considering both the domain type (source/target) and the availability of labels (labeled/unlabeled). Additionally, we encourage the pseudo labels to have low self-entropy and diverse class distributions to improve their reliability. Experiments on several benchmark datasets demonstrate that our TL-ADA model outperforms previous ADA methods, and in-depth analysis supports the effectiveness of our proposed approach. (c) 2023 Elsevier Ltd. All rights reserved.
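The abstract's two core ingredients can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes hypothetical inputs (`pred_loss` from a loss-prediction module, `domain_prob` as a discriminator's probability that a sample is target-like, class probabilities `probs` for pseudo labeling) and shows only the shape of transferability-weighted query selection and low-entropy pseudo-label filtering:

```python
import numpy as np

def select_queries(pred_loss, domain_prob, k):
    """Rank unlabeled target samples by predicted loss weighted by a
    transferability score, then return the indices of the top-k to query.
    High-loss but non-transferable samples are down-weighted, echoing the
    paper's motivation for going beyond plain loss-based selection."""
    score = pred_loss * domain_prob
    return np.argsort(-score)[:k]

def reliable_pseudo_labels(probs, tau=0.5):
    """Keep only pseudo labels whose self-entropy is below tau; low
    entropy is used as a proxy for a confident, reliable prediction."""
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
    labels = probs.argmax(axis=1)
    mask = entropy < tau
    return labels, mask

# Toy usage: 4 unlabeled target samples, query budget of 2.
pred_loss = np.array([0.9, 0.1, 0.5, 0.7])
domain_prob = np.array([0.2, 0.9, 0.8, 0.9])
queries = select_queries(pred_loss, domain_prob, k=2)  # indices [3, 2]

probs = np.array([[0.95, 0.05],   # confident -> kept
                  [0.50, 0.50]])  # maximally uncertain -> rejected
labels, mask = reliable_pseudo_labels(probs)
```

Note how sample 0 has the highest raw loss but is not selected: its low transferability score suppresses it, which is the behavior the transferable loss is designed to encourage.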
Pages: 670-681
Page count: 12