Semi-Supervised Domain Adaptation via Asymmetric Joint Distribution Matching

Cited by: 23
Authors
Chen, Sentao [1]
Harandi, Mehrtash [2,3]
Jin, Xiaona [1]
Yang, Xiaowei [1]
Affiliations
[1] South China Univ Technol, Sch Software Engn, Guangzhou 510006, Peoples R China
[2] Monash Univ, Dept Elect & Comp Syst Engn, Clayton, Vic 3800, Australia
[3] Data61 CSIRO, Canberra, ACT 2601, Australia
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
Manifolds; Optimization; Adaptation models; Predictive models; Data models; Least mean squares methods; Kernel; Feature mapping; joint distribution matching; Riemannian optimization; semi-supervised domain adaptation (SSDA);
DOI
10.1109/TNNLS.2020.3027364
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
An intrinsic problem in domain adaptation is the joint distribution mismatch between the source and target domains. It is therefore crucial to match the two joint distributions so that source-domain knowledge can be properly transferred to the target domain. Unfortunately, in semi-supervised domain adaptation (SSDA) this problem remains unsolved. In this article, we present an asymmetric joint distribution matching (AJDM) approach, which seeks a pair of asymmetric matrices to linearly match the source and target joint distributions under the relative chi-square divergence. Specifically, we introduce a least-squares method to estimate the divergence, which avoids estimating the two joint distributions explicitly. Furthermore, we show that the AJDM approach can be generalized to a kernel version, enabling it to handle nonlinearity in the data. From the perspective of Riemannian geometry, learning the linear and the nonlinear mappings is in both cases formulated as an optimization problem defined on a product of Riemannian manifolds. Numerical experiments on synthetic and real-world data sets demonstrate the effectiveness of the proposed approach and confirm its superiority over existing SSDA techniques.
Pages: 5708-5722
Page count: 15
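
The abstract mentions a least-squares method that estimates the relative chi-square divergence without explicitly estimating the two joint distributions. The sketch below is not the paper's AJDM formulation; it is a minimal illustration of the general least-squares density-ratio idea (in the spirit of uLSIF-style estimators), fitting the ratio between two sample sets in closed form and then plugging it into a Pearson chi-square divergence estimate. All names (`ls_ratio_fit`, `gaussian_kernel`, `sigma`, `lam`) and parameter choices are illustrative assumptions, not taken from the paper.

```python
import numpy as np


def gaussian_kernel(X, C, sigma):
    """Gaussian kernel matrix between samples X (n x d) and centers C (b x d)."""
    sq_dists = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))


def ls_ratio_fit(X_num, X_den, sigma=1.0, lam=1e-3, n_centers=100, seed=0):
    """Least-squares fit of the density ratio r(x) = p_num(x) / p_den(x).

    Models r(x) = sum_j theta_j * k(x, c_j) and minimizes the regularized
    squared error (1/2) E_den[(r_theta(x) - r(x))^2], which has the closed form
        theta = (H + lam * I)^{-1} h,
    with H the kernel second-moment matrix under p_den and h the kernel mean
    under p_num. No density is estimated explicitly.
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X_num), size=min(n_centers, len(X_num)), replace=False)
    centers = X_num[idx]
    Phi_den = gaussian_kernel(X_den, centers, sigma)      # (n_den, b)
    Phi_num = gaussian_kernel(X_num, centers, sigma)      # (n_num, b)
    H = Phi_den.T @ Phi_den / len(X_den)
    h = Phi_num.mean(axis=0)
    theta = np.linalg.solve(H + lam * np.eye(len(centers)), h)
    return lambda X: gaussian_kernel(X, centers, sigma) @ theta


if __name__ == "__main__":
    # Toy example: two 2-D Gaussians with shifted means.
    rng = np.random.default_rng(1)
    X_p = rng.normal(loc=0.0, scale=1.0, size=(500, 2))   # "numerator" samples
    X_q = rng.normal(loc=1.0, scale=1.0, size=(500, 2))   # "denominator" samples
    ratio = ls_ratio_fit(X_p, X_q, sigma=1.0, lam=1e-3)
    # Plug-in Pearson chi-square divergence estimate: PE(p||q) = E_p[r]/2 - 1/2.
    print("divergence estimate:", 0.5 * ratio(X_p).mean() - 0.5)
```

The paper additionally learns the pair of asymmetric mapping matrices by optimizing over a product of Riemannian manifolds; a generic toolbox such as Pymanopt could handle that step, but it is outside the scope of this sketch.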