Pseudo-labeling Integrating Centers and Samples with Consistent Selection Mechanism for Unsupervised Domain Adaptation

Citations: 18
Authors
Li, Lei [1 ,2 ]
Yang, Jun [2 ]
Ma, Yulin [2 ]
Kong, Xuefeng [3 ]
Affiliations
[1] Beihang Univ, Sch Automat Sci & Elect Engn, Beijing, Peoples R China
[2] Beihang Univ, Sch Reliabil & Syst Engn, Beijing, Peoples R China
[3] Zhejiang Sci Tech Univ, Sch Mech Engn & Automat, Zhejiang, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Center information; Consistent selection mechanism; Pseudo-labeling; Sample information; Unsupervised domain adaptation; NETWORK;
DOI
10.1016/j.ins.2023.01.109
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Pseudo-labeling is widely used to generate pseudo labels for target samples in most Unsupervised Domain Adaptation (UDA) methods. Existing UDA methods design the pseudo-labeling strategy using label information from a single source (either sample or center information), ignoring the joint effect of center and sample information on improving the robustness of pseudo-labeling. To address this issue, we propose Pseudo-labeling Integrating Centers and Samples with a Consistent Selection mechanism (PICSCS) for UDA. First, PICSCS assigns a label vector with (1 + K) pseudo labels to each target sample: the 1 pseudo label represents the center information and is determined by the sample's nearest class center, while the K pseudo labels capture adequate sample information and are determined by its K nearest source samples. Second, to use the label information from both sources in pseudo-labeling, PICSCS defines the consistent selection mechanism by judging whether all (1 + K) pseudo labels in the label vector are the same. Label vectors are thereby identified as consistent or inconsistent, and only target samples with consistent label vectors are adopted in each iteration. Finally, extensive experiments on four benchmark datasets (ImageCLEF-DA, Office-31, Office-Caltech, and Office-Home) show that PICSCS stabilizes the iteration and outperforms state-of-the-art UDA methods.
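The (1 + K) labeling and consistent selection described in the abstract can be sketched as follows. This is a minimal illustration under assumed inputs (precomputed feature arrays, class centers, Euclidean distance); the function name `consistent_pseudo_labels` and all variable names are hypothetical and do not come from the paper's implementation.

```python
import numpy as np

def consistent_pseudo_labels(target_feats, centers, source_feats, source_labels, k=3):
    """For each target sample, build a (1 + K) label vector: one pseudo label
    from its nearest class center plus K pseudo labels from its K nearest
    source samples. A sample is kept only if all (1 + K) labels agree."""
    # Center information: distance to each class center -> 1 pseudo label
    d_center = np.linalg.norm(target_feats[:, None, :] - centers[None, :, :], axis=2)
    center_label = d_center.argmin(axis=1)

    # Sample information: distance to each source sample -> K pseudo labels
    d_src = np.linalg.norm(target_feats[:, None, :] - source_feats[None, :, :], axis=2)
    knn_idx = np.argsort(d_src, axis=1)[:, :k]
    knn_labels = source_labels[knn_idx]          # shape (n_target, K)

    # Consistent selection: all K neighbor labels must equal the center label
    consistent = (knn_labels == center_label[:, None]).all(axis=1)
    return center_label, consistent
```

In an iterative UDA training loop, only the targets where `consistent` is `True` would be fed back with their `center_label` as pseudo labels; inconsistent samples are deferred to later iterations.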
Pages: 50-69 (20 pages)