Gradual Geometry-Guided Knowledge Distillation for Source-Data-Free Domain Adaptation

Cited by: 0
Authors
Zhang, Yangkuiyi [1]
Tang, Song [1]
Affiliations
[1] Univ Shanghai Sci & Technol, IMI Grp, Shanghai 200093, Peoples R China
Keywords
domain adaptation; source-data-free; geometry-guided; gradual knowledge distillation; object recognition; entropy
DOI
10.3390/math13091491
Chinese Library Classification
O1 [Mathematics]
Subject Classification Code
0701; 070101
Abstract
Because they require access to the source data during the transfer phase, conventional domain adaptation methods have recently raised safety and privacy concerns. Research attention has therefore shifted to a more practical setting known as source-data-free domain adaptation (SFDA). The new challenge is how to obtain reliable semantic supervision in the absence of both source-domain training data and target-domain labels. To that end, this work introduces a novel Gradual Geometry-Guided Knowledge Distillation (G2KD) approach for SFDA. Specifically, to address the lack of supervision, we use the local geometry of the data to construct a more credible probability distribution over the potential categories, termed geometry-guided knowledge, and then adopt knowledge distillation to integrate this extra information and boost adaptation. More concretely, we first construct a neighborhood geometry for each target sample via a similarity comparison over the whole target dataset. Second, based on semantic estimates pre-obtained by clustering, we mine soft semantic representations that express the geometry-guided knowledge through semantic fusion. Third, using the softened labels, we perform knowledge distillation regulated by a new objective. Given the unsupervised setting of SFDA, in addition to the distillation loss and the student loss, we introduce a mixed entropy regulator that minimizes the entropy of individual samples and maximizes the mutual entropy with augmented data to exploit neighbor relations. Our contribution is that, through local geometry discovery with semantic representation and self-knowledge distillation, the semantic information hidden in local structures is transformed into effective semantic self-supervision. Moreover, our knowledge distillation works in a gradual way, which helps capture the dynamic variations in the local geometry and mitigates both guidance degradation and deviation. Extensive experiments on five challenging benchmarks confirm the state-of-the-art performance of our method.
Pages: 24
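The abstract describes the G2KD pipeline only at a high level, and this record contains no implementation details. The following is a minimal, hedged PyTorch sketch of the core ideas as stated in the abstract: building a cosine-similarity neighborhood over target features, fusing the neighbors' clustering-based probability estimates into geometry-guided soft labels, and combining a distillation loss, a student loss, and an entropy term. All names (e.g., geometry_guided_soft_labels, g2kd_loss, the neighborhood size k, the temperature tau) are illustrative assumptions, not the authors' released code; the mutual-entropy term with augmented views and the gradual (periodic) refresh of the soft labels are only indicated in comments.

```python
# Hypothetical sketch of geometry-guided soft-label construction and the
# accompanying distillation objective, assuming names and defaults not given
# in the record. Not the authors' implementation.
import torch
import torch.nn.functional as F


def geometry_guided_soft_labels(features, probs, k=5):
    """Fuse the clustering-based probability estimates of each sample's k
    nearest neighbors (cosine similarity over the whole target set) into a
    geometry-guided soft label. In a gradual scheme these would be recomputed
    as the adapting model's features evolve."""
    feats = F.normalize(features, dim=1)        # (N, D) unit-norm features
    sim = feats @ feats.t()                     # pairwise cosine similarity
    _, idx = sim.topk(k + 1, dim=1)             # each sample plus k neighbors
    neighbor_probs = probs[idx]                 # (N, k+1, C) neighbor estimates
    soft = neighbor_probs.mean(dim=1)           # semantic fusion by averaging
    return soft / soft.sum(dim=1, keepdim=True)


def g2kd_loss(student_logits, soft_labels, pseudo_labels, alpha=0.5, tau=2.0):
    """Distillation loss (KL to the geometry-guided soft labels) + student loss
    (cross-entropy on clustering pseudo-labels) + entropy minimization on
    individual predictions. The mutual-entropy term computed against augmented
    views is omitted here for brevity."""
    log_p = F.log_softmax(student_logits / tau, dim=1)
    distill = F.kl_div(log_p, soft_labels, reduction="batchmean") * tau ** 2
    student = F.cross_entropy(student_logits, pseudo_labels)
    p = F.softmax(student_logits, dim=1)
    ent = -(p * torch.log(p + 1e-8)).sum(dim=1).mean()
    return alpha * distill + (1 - alpha) * student + ent


if __name__ == "__main__":
    # Toy usage with random target features and cluster-based estimates.
    N, D, C = 32, 64, 10
    feats = torch.randn(N, D)
    probs = F.softmax(torch.randn(N, C), dim=1)
    soft = geometry_guided_soft_labels(feats, probs, k=5)
    logits = torch.randn(N, C)
    pseudo = probs.argmax(dim=1)
    print(g2kd_loss(logits, soft, pseudo).item())
```

A reasonable reading of the "gradual" aspect is that geometry_guided_soft_labels is re-run at intervals during adaptation so the soft labels track the changing local geometry; how often, and how alpha and tau are scheduled, is not specified in this record.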