HOMDA: High-Order Moment-Based Domain Alignment for unsupervised domain adaptation

Cited by: 28
Authors
Dan, Jun [1 ]
Jin, Tao [1 ]
Chi, Hao [2 ]
Shen, Yixuan [1 ]
Yu, Jiawang [1 ]
Zhou, Jinhai [1 ]
Affiliations
[1] Zhejiang Univ, Coll Informat Sci & Elect Engn, Hangzhou 310027, Peoples R China
[2] Hangzhou Dianzi Univ, Sch Commun Engn, Hangzhou 310018, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Transfer learning; Domain adaptation; Optimal transport; High-order moment; Discriminative feature learning; Recognition
DOI
10.1016/j.knosys.2022.110205
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Unsupervised domain adaptation aims to annotate unlabeled target-domain samples by utilizing transferable knowledge learned from the source domain. Optimal transport (OT) has become a popular probability metric for measuring the distribution discrepancy across domains. However, most OT-based methods inevitably suffer from the false-matching problem, resulting in poor domain alignment. Furthermore, because the domain shift cannot be completely eliminated, target-domain data distributed near cluster edges or far from their corresponding class centers are highly likely to be misclassified by the decision hyperplane learned from the source samples. We propose a High-Order Moment-Based Domain Alignment (HOMDA) method that uses an improved partial optimal transport (POT) strategy and a discriminative centroid-wise clustering regularization scheme to address these issues. The improved POT strategy performs fine-grained domain alignment by limiting the amount of mass transported during the mapping. The centroid-wise clustering scheme promotes shared features with better intra-class compactness and inter-class separability, effectively reducing the negative transfer caused by hard-aligned target samples. To better capture discriminative representations, HOMDA also includes a high-order statistics mixer. Extensive experiments show that our proposal improves adaptation performance significantly and is competitive with state-of-the-art methods. © 2022 Elsevier B.V. All rights reserved.
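The abstract's "limiting the amount of mass transported" refers to partial OT, where only a fraction m of the total probability mass must be matched, so outlier samples can be left untransported instead of being falsely matched. A minimal sketch of the classic dummy-point reduction (extending the problem with a dummy source and target that absorb the unmatched mass, then solving a balanced OT linear program) is below; this illustrates plain partial OT, not the paper's improved POT strategy, and the function and variable names are our own.

```python
import numpy as np
from scipy.optimize import linprog

def partial_ot_plan(a, b, C, m):
    """Partial OT transporting total mass m between histograms a, b
    (each summing to 1) with cost matrix C, via the dummy-point
    reduction to a balanced OT linear program."""
    n, k = C.shape
    # Extend with one dummy source and one dummy target that absorb the
    # untransported mass 1 - m; a large dummy-to-dummy cost forces
    # exactly mass m to flow between real points.
    big = 10.0 * C.max() + 1.0
    C_ext = np.zeros((n + 1, k + 1))
    C_ext[:n, :k] = C
    C_ext[n, k] = big
    a_ext = np.append(a, 1.0 - m)
    b_ext = np.append(b, 1.0 - m)
    # Equality constraints: row sums = a_ext, column sums = b_ext.
    N, K = n + 1, k + 1
    A_eq = np.zeros((N + K, N * K))
    for i in range(N):
        A_eq[i, i * K:(i + 1) * K] = 1.0   # row-sum constraint
    for j in range(K):
        A_eq[N + j, j::K] = 1.0            # column-sum constraint
    b_eq = np.concatenate([a_ext, b_ext])
    res = linprog(C_ext.ravel(), A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    T = res.x.reshape(N, K)
    return T[:n, :k]  # transport plan between real points only

# Toy example: uniform histograms on 3 points, |i - j| ground cost.
a = np.full(3, 1 / 3)
b = np.full(3, 1 / 3)
C = np.abs(np.arange(3)[:, None] - np.arange(3)[None, :]).astype(float)
T = partial_ot_plan(a, b, C, m=0.6)
print(T.sum())  # ~0.6: only 60% of the mass is matched
```

Capping the transported mass at m is what lets a partial OT plan skip the hardest-to-match samples, which is the mechanism the paper builds on to reduce false matching.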
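The "high-order moment" in the method's name refers to aligning statistics beyond means and covariances between the source and target feature distributions. A minimal numpy sketch of a per-dimension moment-discrepancy loss is shown below as an illustration of the general idea; the function name and the exact form of the loss are our own assumptions, not the paper's high-order statistics mixer.

```python
import numpy as np

def moment_discrepancy(src, tgt, max_order=3):
    """Sum of squared differences between per-dimension moments of the
    source and target features: the mean for order 1, central moments
    for orders 2..max_order. (Illustrative sketch only.)"""
    total = 0.0
    for k in range(1, max_order + 1):
        if k == 1:
            ms = src.mean(axis=0)
            mt = tgt.mean(axis=0)
        else:
            ms = ((src - src.mean(axis=0)) ** k).mean(axis=0)
            mt = ((tgt - tgt.mean(axis=0)) ** k).mean(axis=0)
        total += float(np.sum((ms - mt) ** 2))
    return total

# Features drawn from shifted/rescaled Gaussians stand in for two domains.
rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(500, 4))
tgt = rng.normal(0.5, 1.5, size=(500, 4))
print(moment_discrepancy(src, tgt))  # positive: the domains differ
print(moment_discrepancy(src, src))  # 0.0 for identical samples
```

Minimizing such a discrepancy over a shared feature extractor pulls the two domains' feature statistics together order by order, which is the intuition behind moment-based domain alignment.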
Pages: 12