Domain Adaptation Transfer Learning by Kernel Representation Adaptation

Cited by: 1
Authors
Chen, Xiaoyi [1 ]
Lengelle, Regis [1 ]
Affiliations
[1] Univ Technol Troyes, Charles Delaunay Inst, Res Team LM2S, CNRS, ROSAS Dept, UMR 6281, 12 Rue Marie Curie, CS 42060, F-10004 Troyes, France
DOI
10.1007/978-3-319-93647-5_3
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Domain adaptation, where no labeled target data is available, is a challenging task. To solve this problem, we first propose a new SVM-based approach with a supplementary Maximum Mean Discrepancy (MMD)-like constraint. With this heuristic, source and target data are projected onto a common subspace of a Reproducing Kernel Hilbert Space (RKHS) where both data distributions are expected to become similar. Therefore, a classifier trained on source data might perform well on target data if the conditional probabilities of labels are similar for source and target data, which is the main assumption of this paper. We demonstrate that adding this constraint does not change the quadratic nature of the optimization problem, so common quadratic optimization tools can be used. Second, using the same idea that making source and target data similar might ensure efficient transfer learning, and under the same assumption, we propose a Kernel Principal Component Analysis (KPCA)-based transfer learning method. Unlike the first heuristic, this second method also aligns higher-order moments in the RKHS, which leads to better performance. Here again, we select MMD as the similarity measure. A linear transformation is then applied to further improve the alignment between source and target data. We finally compare both methods with other transfer learning methods from the literature to show their efficiency on synthetic and real datasets.
Pages: 45-61
Page count: 17
Related Papers
(50 in total)
  • [21] Bi-adapting kernel learning for unsupervised domain adaptation
    Wang, Zengmao
    Xiao, Pan
    Tu, Weiping
    Du, Bo
    Cheng, Yanxiang
    NEUROCOMPUTING, 2020, 398 : 547 - 554
  • [22] Kernel Extreme Learning Machine with Discriminative Transfer Feature and Instance Selection for Unsupervised Domain Adaptation
    Zang, Shaofei
    Li, Huimin
    Lu, Nannan
    Ma, Chao
    Gao, Jiwei
    Ma, Jianwei
    Lv, Jinfeng
    NEURAL PROCESSING LETTERS, 2024, 56 (04)
  • [23] Learning Discriminative Geodesic Flow Kernel for Unsupervised Domain Adaptation
    Wei, Jianze
    Liang, Jian
    He, Ran
    Yang, Jinfeng
    2018 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2018
  • [24] Domain Space Transfer Extreme Learning Machine for Domain Adaptation
    Chen, Yiming
    Song, Shiji
    Li, Shuang
    Yang, Le
    Wu, Cheng
    IEEE TRANSACTIONS ON CYBERNETICS, 2019, 49 (05) : 1909 - 1922
  • [25] Domain adaptation based transfer learning for patent transfer prediction
    Liu, Weidong
    Wang, Yiming
    Gan, Keqin
    Luo, Xiangfeng
    Zhang, Yu
    Jiang, Cuicui
    KNOWLEDGE-BASED SYSTEMS, 2025, 315
  • [26] Unsupervised domain adaptation via representation learning and adaptive classifier learning
    Gheisari, Marzieh
    Baghshah, Mahdieh Soleymani
    NEUROCOMPUTING, 2015, 165 : 300 - 311
  • [27] Discriminative Kernel Matrix for Domain Adaptation
    Razzaghi, Parisa
    Razzaghi, Parvin
    26TH IRANIAN CONFERENCE ON ELECTRICAL ENGINEERING (ICEE 2018), 2018, : 1530 - 1535
  • [28] Representation learning via an integrated autoencoder for unsupervised domain adaptation
    Zhu, Yi
    Wu, Xindong
    Qiang, Jipeng
    Yuan, Yunhao
    Li, Yun
    Frontiers of Computer Science, 2023, 17 (05) : 77 - 89
  • [29] Kernel Manifold Alignment for Domain Adaptation
    Tuia, Devis
    Camps-Valls, Gustau
    PLOS ONE, 2016, 11 (02)
  • [30] Unsupervised Domain Adaptation in the Wild via Disentangling Representation Learning
    Li, Haoliang
    Wan, Renjie
    Wang, Shiqi
    Kot, Alex C.
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2021, 129 (02) : 267 - 283