Deep conditional adaptation networks and label correlation transfer for unsupervised domain adaptation

Cited by: 26
Authors
Chen, Yu [1]
Yang, Chunling [1]
Zhang, Yan [1]
Li, Yuze [1]
Affiliations
[1] Harbin Inst Technol, Sch Elect Engn & Automat, Harbin 150001, Heilongjiang, Peoples R China
Keywords
Conditional domain adaptation; Deep learning; Unsupervised learning; Label transfer
DOI
10.1016/j.patcog.2019.107072
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Unsupervised domain adaptation aims to improve performance on an unlabeled target domain by exploiting knowledge learned from a related source domain. Because target label information is unavailable in the unsupervised setting, it is challenging to match the domain distributions and to transfer the source model to target applications. In this paper, a Deep Conditional Adaptation Network (DCAN) is proposed to address the unsupervised domain adaptation problem. DCAN is built on a deep neural network and learns domain-invariant features based on the Wasserstein distance. A conditional adaptation strategy is presented to reduce the domain distribution discrepancy and to address category mismatch and class prior bias, which are usually ignored by marginal adaptation approaches. Furthermore, a label correlation transfer algorithm is proposed to cope with the absence of target labels by generating more effective pseudo target labels from the underlying cross-domain relationship. Comparative experiments on standard domain adaptation benchmarks demonstrate that the proposed DCAN outperforms previous adaptation methods. (C) 2019 Elsevier Ltd. All rights reserved.
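To make the ideas named in the abstract concrete, the sketch below is a hypothetical, simplified illustration (not the authors' DCAN implementation): it scores a class-conditional Wasserstein-style discrepancy between source features and target features that share the same confident pseudo label, approximating the Wasserstein distance by averaging 1-D distances over random projections (a sliced-Wasserstein estimate). The function names `sliced_wasserstein`, `pseudo_labels`, and `conditional_discrepancy`, the confidence threshold, and the projection counts are assumptions chosen for illustration only.

```python
# Hypothetical sketch of class-conditional Wasserstein alignment with
# confidence-filtered pseudo labels; not the DCAN code from the paper.
import numpy as np

def sliced_wasserstein(x, y, n_projections=64, n_quantiles=128, seed=None):
    """Approximate the Wasserstein-1 distance between two feature sets by
    averaging 1-D Wasserstein distances along random projection directions."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    total = 0.0
    q = np.linspace(0.0, 1.0, n_quantiles)
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        # 1-D W1 distance equals the integrated gap between quantile functions.
        fx = np.quantile(x @ theta, q)
        fy = np.quantile(y @ theta, q)
        total += np.abs(fx - fy).mean()
    return total / n_projections

def pseudo_labels(target_probs, threshold=0.9):
    """Keep only confident target predictions as (index, label) pseudo labels."""
    confidence = target_probs.max(axis=1)
    keep = np.nonzero(confidence >= threshold)[0]
    return keep, target_probs[keep].argmax(axis=1)

def conditional_discrepancy(src_feat, src_lab, tgt_feat, tgt_probs):
    """Average per-class discrepancy between source features and target
    features assigned the same pseudo label by a source-trained classifier."""
    idx, lab = pseudo_labels(tgt_probs)
    tgt_feat = tgt_feat[idx]
    losses = []
    for c in np.unique(src_lab):
        s = src_feat[src_lab == c]
        t = tgt_feat[lab == c]
        if len(s) > 1 and len(t) > 1:
            losses.append(sliced_wasserstein(s, t))
    return float(np.mean(losses)) if losses else 0.0
```

In a training loop, a discrepancy of this kind would typically be added to the source classification loss so that the feature extractor is pushed toward class-conditionally aligned representations; the confidence threshold controls how many pseudo-labeled target samples participate in the alignment.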
Pages: 11