Unsupervised New-set Domain Adaptation with Self-supervised Knowledge

Cited by: 0
Authors
Wang Y.-Y. [1 ]
Sun G.-W. [1 ]
Zhao G.-X. [2 ]
Xue H. [3 ]
Affiliations
[1] School of Computer Science, Nanjing University of Posts and Telecommunications, Nanjing
[2] College of Automation, Nanjing University of Posts and Telecommunications, Nanjing
[3] School of Computer Science and Engineering, Southeast University, Nanjing
Source
Ruan Jian Xue Bao/Journal of Software | 2022, Vol. 33, No. 4
Keywords
Class contrastive knowledge; Label space; Self-supervised learning; Unsupervised domain adaptation;
DOI
10.13328/j.cnki.jos.006478
Abstract
Unsupervised domain adaptation (UDA) uses a source domain with large amounts of labeled data to help learn a target domain that has no label information. In UDA, the source and target domains usually have different data distributions but share the same label space. In real open scenarios, however, the label spaces of the two domains can also differ; in the extreme case there is no shared class between the domains, i.e., all classes in the target domain are new classes. In this case, directly transferring the discriminative knowledge of the source domain would harm performance on the target domain, leading to negative transfer. This study therefore proposes unsupervised new-set domain adaptation with self-supervised knowledge (SUNDA). First, self-supervised learning is adopted to learn initial features on the source and target domains, with the first few layers frozen in order to preserve target-domain information. Then, the class-contrastive knowledge of the source domain is transferred to help learn discriminative features for the target domain. Moreover, a graph-based self-supervised classification loss is adopted to handle the classification problem in the target domain, where no common labels exist. Experiments are conducted on both digit and face recognition tasks without shared classes, and the empirical results show that SUNDA is competitive with UDA and unsupervised clustering methods, as well as with a novel category discovery method. © Copyright 2022, Institute of Software, the Chinese Academy of Sciences. All rights reserved.
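The graph-based, class-contrastive objective summarized in the abstract can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's implementation: the function name, the cosine-similarity threshold for building pairwise pseudo-labels, and the binary cross-entropy form of the loss are all assumptions, chosen because pairwise same-class/different-class supervision is the common way such contrastive knowledge is expressed when the target domain has no labels.

```python
import numpy as np

def pairwise_contrastive_loss(features, class_probs, threshold=0.9, eps=1e-8):
    """Illustrative pairwise (class-contrastive) objective on unlabeled target data.

    features:    (n, d) array of target-sample features.
    class_probs: (n, k) array of predicted class distributions (rows sum to 1).
    A pair of samples gets pseudo-label 1 ("same new class") if their feature
    cosine similarity exceeds `threshold`, and 0 otherwise; the classifier is
    then penalized for disagreeing with these pseudo pairwise labels.
    """
    # Cosine similarity between all pairs of target features
    f = features / (np.linalg.norm(features, axis=1, keepdims=True) + eps)
    sim = f @ f.T
    # Pseudo pairwise label: 1 if the pair looks like the same (new) class
    pair_label = (sim > threshold).astype(float)
    # Predicted probability that a pair shares a class, from the classifier head
    pair_prob = np.clip(class_probs @ class_probs.T, eps, 1.0 - eps)
    # Binary cross-entropy averaged over all pairs
    return float(-np.mean(pair_label * np.log(pair_prob)
                          + (1.0 - pair_label) * np.log(1.0 - pair_prob)))
```

In this sketch the pairwise pseudo-labels act as the "graph" over target samples, so the classifier can be trained on the target domain even though its classes never appear in the source label space.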
Pages: 1170-1182
Page count: 12