Conditional Independence Induced Unsupervised Domain Adaptation

Cited by: 6
Authors
Xu, Xiao-Lin [1 ]
Xu, Geng-Xin [2 ]
Ren, Chuan-Xian [2 ,3 ]
Dai, Dao-Qing [2 ]
Yan, Hong [4 ,5 ]
Affiliations
[1] Guangdong Univ Finance & Econ, Sch Stat & Math, Guangzhou 510320, Peoples R China
[2] Sun Yat Sen Univ, Sch Math, Guangzhou 510275, Peoples R China
[3] Guangdong Prov Key Lab Computat Sci, Guangzhou 510006, Peoples R China
[4] CityU Hong Kong, Ctr Intelligent Multidimens Data Anal, Hong Kong, Peoples R China
[5] CityU Hong Kong, Dept Elect & Engn, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Domain Adaptation; Discriminant Analysis; Feature Learning; Conditional Independence; Classification; NETWORK;
DOI
10.1016/j.patcog.2023.109787
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Learning domain-adaptive features is important for tackling the dataset-bias problem, in which the data distributions of the labeled source domain and the unlabeled target domain may differ. The critical issue is to identify and then reduce redundant information, including class-irrelevant and domain-specific features. In this paper, a conditional independence induced unsupervised domain adaptation (CIDA) method is proposed to tackle these challenges. It aims to find a low-dimensional, transferable feature representation of each observation, namely the latent variable in the domain-adaptive subspace. Technically, two mutual information terms are optimized simultaneously: the mutual information between the latent variable and the class label, and the mutual information between the latent variable and the domain label. Notably, the key module can be approximately reformulated as a conditional independence/dependence based optimization problem, and thus it admits a probabilistic interpretation via the Gaussian process. Temporary labels of the target samples and the model parameters are optimized alternately. The objective function can be incorporated into deep network architectures, and the algorithm is implemented iteratively in an end-to-end manner. Extensive experiments on several benchmark datasets demonstrate the effectiveness of CIDA. © 2023 Elsevier Ltd. All rights reserved.
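Kernel-based criteria such as the Hilbert-Schmidt Independence Criterion (HSIC) are a standard way to quantify the kind of conditional independence/dependence between learned features and class or domain labels that the abstract describes. The following is a minimal NumPy sketch of a biased HSIC estimate for illustration only; it is not the authors' implementation, and the function names, the fixed RBF bandwidth, and the toy data are assumptions of this sketch.

```python
import numpy as np

def rbf_gram(x, sigma=1.0):
    # Pairwise squared Euclidean distances -> RBF (Gaussian) Gram matrix.
    sq = np.sum(x ** 2, axis=1, keepdims=True)
    d2 = sq + sq.T - 2.0 * x @ x.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    # Biased HSIC estimate: trace(K H L H) / (n - 1)^2,
    # where H = I - (1/n) 11^T centers the Gram matrices.
    # Larger values indicate stronger statistical dependence.
    n = x.shape[0]
    h = np.eye(n) - np.ones((n, n)) / n
    k = rbf_gram(x, sigma)
    l = rbf_gram(y, sigma)
    return np.trace(k @ h @ l @ h) / (n - 1) ** 2

rng = np.random.default_rng(0)
z = rng.normal(size=(200, 2))                        # toy "latent features"
y_dep = z[:, :1] + 0.1 * rng.normal(size=(200, 1))   # depends on z
y_ind = rng.normal(size=(200, 1))                    # independent of z

# Dependence yields a larger HSIC value than independence.
print(hsic(z, y_dep) > hsic(z, y_ind))
```

In the spirit of the two mutual-information terms above, such a criterion would be driven up between the latent variable and the class label, and down between the latent variable and the domain label.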
Pages: 14