ECDT: Exploiting Correlation Diversity for Knowledge Transfer in Partial Domain Adaptation

Times Cited: 0
Authors
He, Shichang [1 ]
Liu, Xuan [1 ,2 ]
Chen, Xinning [1 ]
Huang, Ying [1 ]
Affiliations
[1] Hunan Univ, Coll Comp Sci & Elect Engn, Changsha, Hunan, Peoples R China
[2] Tsinghua Univ, Sci & Technol Parallel & Distributed Proc Lab, Beijing, Peoples R China
Source
2020 16th International Conference on Mobility, Sensing and Networking (MSN 2020) | 2020
Funding
National Natural Science Foundation of China;
Keywords
Transfer learning; Neural networks; Domain adaptation; Samples tagging;
DOI
10.1109/MSN50589.2020.00127
CLC Number
TP [Automation Technology; Computer Technology];
Discipline Classification Code
0812;
Abstract
Domain adaptation aims to transfer knowledge across different domains and bridge the gap between them. While traditional domain adaptation assumes that the source and target domains share an identical label space, a more realistic scenario is to transfer from a larger and more diverse source domain to a smaller target domain, which is referred to as partial domain adaptation (PDA). However, matching the whole source domain to the target domain in PDA can produce negative transfer, so samples in the shared classes should be carefully selected. We observe that the correlations between target domain samples and source domain samples are diverse: classes are not equally correlated, and different samples have different correlation strengths even within the same class. In this study, we propose ECDT, a novel PDA method that Exploits the Correlation Diversity for knowledge Transfer between different domains. We propose a novel method to estimate the target domain label space that utilizes both the label distribution and the feature distribution of target samples, based on which outlier source classes can be filtered out and their negative effects on transfer mitigated. Moreover, ECDT combines class-level correlation and instance-level correlation to quantify sample-level transferability in a domain adversarial network. Experimental results on three commonly used cross-domain object datasets show that ECDT is superior to previous partial domain adaptation methods.
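The abstract describes the weighting idea only at a high level. The sketch below is an illustrative PyTorch interpretation of how class-level and instance-level correlation might be combined into a sample-level transferability weight, not the authors' implementation: the function names, the top-k class filtering with keep_ratio, and the entropy-based instance score are assumptions introduced here, and the feature-distribution component of the label-space estimation mentioned in the abstract is omitted.

```python
import torch
import torch.nn.functional as F

def estimate_shared_classes(target_logits, keep_ratio=0.5):
    # Average the softmax predictions of all target samples to obtain a
    # class-level weight (an estimate of the target label distribution),
    # then keep the highest-weighted classes as the shared label space.
    # keep_ratio is a hypothetical hyperparameter, not from the paper.
    probs = F.softmax(target_logits, dim=1)            # (n_target, n_source_classes)
    class_weights = probs.mean(dim=0)                   # class-level correlation
    n_keep = max(1, int(keep_ratio * class_weights.numel()))
    shared_classes = torch.topk(class_weights, n_keep).indices
    return shared_classes, class_weights

def sample_transferability(target_logits, class_weights):
    # Class-level score: expected weight of the classes a sample is predicted
    # to belong to. Instance-level score: prediction confidence, measured as
    # one minus the normalized entropy of the softmax output (an assumption).
    probs = F.softmax(target_logits, dim=1)
    class_score = (probs * class_weights).sum(dim=1)
    entropy = -(probs * torch.log(probs + 1e-8)).sum(dim=1)
    instance_score = 1.0 - entropy / torch.log(torch.tensor(float(probs.size(1))))
    weights = class_score * instance_score               # sample-level transferability
    return weights / (weights.mean() + 1e-8)             # renormalize around 1
```

In a domain adversarial setup, such weights could multiply the per-sample discriminator loss so that target samples resembling outlier source classes contribute little to feature alignment.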
Pages: 746-751
Number of Pages: 6